Mar 22 00:08:48 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 22 00:08:49 crc kubenswrapper[5116]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.417222 5116 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421411 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421439 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421443 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421447 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421451 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421454 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421460 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421463 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421467 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421470 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421474 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421478 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421482 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421486 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421489 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421494 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421501 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421505 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421509 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421512 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421517 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421522 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421527 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421533 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421536 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421539 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421543 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421546 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421549 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421552 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421556 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421559 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421562 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421565 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421569 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421572 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421575 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421578 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421581 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421585 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421588 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421592 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421596 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421600 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421604 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421607 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421610 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421615 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421618 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421621 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421624 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421629 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421634 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421637 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421640 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421644 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421647 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421652 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421655 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421660 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421663 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421667 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421671 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421675 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421678 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421681 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421684 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421687 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421690 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421693 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421697 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421700 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421705 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421709 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421713 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421718 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421721 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421724 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421728 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421731 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421734 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421737 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421740 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421744 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421747 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.421750 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422296 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422306 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422310 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422315 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422319 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422323 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422327 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422331 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422335 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422339 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422342 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422347 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422351 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422355 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422359 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422364 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422367 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422371 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422375 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422379 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422385 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422389 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422393 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422397 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422401 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422404 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422408 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422411 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422414 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422417 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422421 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422424 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422427 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422430 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422433 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422436 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422440 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422444 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422448 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422452 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422456 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422461 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422465 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422469 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422474 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422477 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422481 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422486 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422490 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422493 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422497 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422500 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422504 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422508 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422511 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422515 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422519 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422522 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422525 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422529 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422533 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422537 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422541 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422559 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422565 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422569 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422573 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422578 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422582 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422586 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422590 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422593 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422599 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422604 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422609 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422613 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422617 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422624 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422629 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422634 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422639 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422643 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422647 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422651 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422655 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.422660 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.423969 5116 flags.go:64] FLAG: --address="0.0.0.0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.423984 5116 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.423997 5116 flags.go:64] FLAG: --anonymous-auth="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424003 5116 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424008 5116 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424012 5116 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424018 5116 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424023 5116 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424027 5116 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424031 5116 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424035 5116 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424040 5116 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424043 5116 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424047 5116 flags.go:64] FLAG: --cgroup-root=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424051 5116 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424055 5116 flags.go:64] FLAG: --client-ca-file=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424059 5116 flags.go:64] FLAG: --cloud-config=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424062 5116 flags.go:64] FLAG: --cloud-provider=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424066 5116 flags.go:64] FLAG: --cluster-dns="[]"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424072 5116 flags.go:64] FLAG: --cluster-domain=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424075 5116 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424079 5116 flags.go:64] FLAG: --config-dir=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424082 5116 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424086 5116 flags.go:64] FLAG: --container-log-max-files="5"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424091 5116 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424095 5116 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424099 5116 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424103 5116 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424107 5116 flags.go:64] FLAG: --contention-profiling="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424111 5116 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424114 5116 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424119 5116 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424122 5116 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424133 5116 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424138 5116 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424142 5116 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424146 5116 flags.go:64] FLAG: --enable-load-reader="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424150 5116 flags.go:64] FLAG: --enable-server="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424154 5116 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424159 5116 flags.go:64] FLAG: --event-burst="100"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424164 5116 flags.go:64] FLAG: --event-qps="50"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424193 5116 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424199 5116 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424204 5116 flags.go:64] FLAG: --eviction-hard=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424210 5116 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424213 5116 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424217 5116 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424222 5116 flags.go:64] FLAG: --eviction-soft=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424225 5116 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424230 5116 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424235 5116 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424242 5116 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424251 5116 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424255 5116 flags.go:64] FLAG: --fail-swap-on="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424260 5116 flags.go:64] FLAG: --feature-gates=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424266 5116 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424271 5116 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424276 5116 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424281 5116 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424286 5116 flags.go:64] FLAG: --healthz-port="10248"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424290 5116 flags.go:64] FLAG: --help="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424298 5116 flags.go:64] FLAG: --hostname-override=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424303 5116 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424307 5116 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424312 5116 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424316 5116 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424326 5116 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424331 5116 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424335 5116 flags.go:64] FLAG: --image-service-endpoint=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424340 5116 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424344 5116 flags.go:64] FLAG: --kube-api-burst="100"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424348 5116 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424353 5116 flags.go:64] FLAG: --kube-api-qps="50"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424358 5116 flags.go:64] FLAG: --kube-reserved=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424363 5116 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424367 5116 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424372 5116 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424377 5116 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424382 5116 flags.go:64] FLAG: --lock-file=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424386 5116 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424390 5116 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424395 5116 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424402 5116 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424406 5116 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424410 5116 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322
00:08:49.424414 5116 flags.go:64] FLAG: --logging-format="text" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424418 5116 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424423 5116 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424427 5116 flags.go:64] FLAG: --manifest-url="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424431 5116 flags.go:64] FLAG: --manifest-url-header="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424438 5116 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424442 5116 flags.go:64] FLAG: --max-open-files="1000000" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424448 5116 flags.go:64] FLAG: --max-pods="110" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424456 5116 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424461 5116 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424465 5116 flags.go:64] FLAG: --memory-manager-policy="None" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424470 5116 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424475 5116 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424481 5116 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424488 5116 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhel" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424501 5116 flags.go:64] FLAG: --node-status-max-images="50" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 
00:08:49.424505 5116 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424508 5116 flags.go:64] FLAG: --oom-score-adj="-999" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424512 5116 flags.go:64] FLAG: --pod-cidr="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424516 5116 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc2b30e70040205c2536d01ae5c850be1ed2d775cf13249e50328e5085777977" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424525 5116 flags.go:64] FLAG: --pod-manifest-path="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424529 5116 flags.go:64] FLAG: --pod-max-pids="-1" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424532 5116 flags.go:64] FLAG: --pods-per-core="0" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424536 5116 flags.go:64] FLAG: --port="10250" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424540 5116 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424543 5116 flags.go:64] FLAG: --provider-id="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424547 5116 flags.go:64] FLAG: --qos-reserved="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424551 5116 flags.go:64] FLAG: --read-only-port="10255" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424555 5116 flags.go:64] FLAG: --register-node="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424559 5116 flags.go:64] FLAG: --register-schedulable="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424562 5116 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424569 5116 flags.go:64] FLAG: --registry-burst="10" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424573 5116 flags.go:64] FLAG: --registry-qps="5" Mar 22 
00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424576 5116 flags.go:64] FLAG: --reserved-cpus="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424580 5116 flags.go:64] FLAG: --reserved-memory="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424584 5116 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424588 5116 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424591 5116 flags.go:64] FLAG: --rotate-certificates="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424595 5116 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424598 5116 flags.go:64] FLAG: --runonce="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424602 5116 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424608 5116 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424612 5116 flags.go:64] FLAG: --seccomp-default="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424615 5116 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424619 5116 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424626 5116 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424630 5116 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424634 5116 flags.go:64] FLAG: --storage-driver-password="root" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424638 5116 flags.go:64] FLAG: --storage-driver-secure="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424642 5116 flags.go:64] FLAG: 
--storage-driver-table="stats" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424646 5116 flags.go:64] FLAG: --storage-driver-user="root" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424650 5116 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424654 5116 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424657 5116 flags.go:64] FLAG: --system-cgroups="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424661 5116 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424667 5116 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424671 5116 flags.go:64] FLAG: --tls-cert-file="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424674 5116 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424681 5116 flags.go:64] FLAG: --tls-min-version="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424684 5116 flags.go:64] FLAG: --tls-private-key-file="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424688 5116 flags.go:64] FLAG: --topology-manager-policy="none" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424691 5116 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424695 5116 flags.go:64] FLAG: --topology-manager-scope="container" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424699 5116 flags.go:64] FLAG: --v="2" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424705 5116 flags.go:64] FLAG: --version="false" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424710 5116 flags.go:64] FLAG: --vmodule="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424715 5116 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.424719 5116 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424813 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424817 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424821 5116 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424824 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424828 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424833 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424837 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424840 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424843 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424847 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424851 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424854 5116 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424858 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 22 00:08:49 
crc kubenswrapper[5116]: W0322 00:08:49.424861 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424865 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424868 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424871 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424874 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424877 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424881 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424884 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424887 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424890 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424893 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424896 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424900 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424903 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424906 5116 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424910 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424914 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424917 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424920 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424923 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424926 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424930 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424933 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424936 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424941 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424944 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424947 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424950 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424954 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity 
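The FLAG dump above can be reduced to a plain name/value list with standard text tools. A minimal sketch, assuming the journal excerpt has been saved to a file (the filename `kubelet.log` is hypothetical; on a live node you would feed it from `journalctl -u kubelet` instead):

```shell
# Extract every 'FLAG: --name="value"' entry from the saved log and
# print it as "name value", sorted by flag name.
grep -o 'FLAG: --[a-z-]*="[^"]*"' kubelet.log \
  | sed -e 's/^FLAG: --//' -e 's/="/ /' -e 's/"$//' \
  | sort
```

This only relies on the `FLAG: --name="value"` shape visible in the dump, so it works regardless of how the journal lines were wrapped.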
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424957 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424960 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424963 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424968 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424971 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424974 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424977 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424980 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424983 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424986 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424990 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424993 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.424996 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425000 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425003 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425006 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425010 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425013 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425016 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425020 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425024 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425028 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425032 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425035 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425038 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425041 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425045 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425050 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425053 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425056 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425060 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425063 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425066 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425069 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425072 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425076 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425080 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425083 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425086 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425090 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
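The same gate names are warned about repeatedly because the configured gate set is parsed more than once during startup. To see the distinct set rather than the repeats, something like the following works (again assuming the excerpt is saved as the hypothetical `kubelet.log`):

```shell
# Print each distinct feature gate name flagged as unrecognized,
# one per line, sorted and de-duplicated.
grep -o 'unrecognized feature gate: [A-Za-z0-9]\+' kubelet.log \
  | sed 's/.*: //' \
  | sort -u
```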
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425095 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425098 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425101 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.425105 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.426593 5116 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.437059 5116 server.go:530] "Kubelet version" kubeletVersion="v1.33.5"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.437443 5116 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437501 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437508 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437512 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437516 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437520 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437523 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437528 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437534 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437538 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437541 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437545 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437549 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437552 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437556 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437559 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437562 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437565 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437569 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437572 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437575 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437578 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437584 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437588 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437592 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437595 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437598 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437602 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437606 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437610 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437614 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437617 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437621 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437626 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437630 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437634 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437638 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437642 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437646 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437649 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437654 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437658 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437662 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437665 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437668 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437672 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437675 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437678 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
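The deprecation notices at startup say that flags such as --system-reserved, --register-with-taints, and --volume-plugin-dir should move into the file named by --config (/etc/kubernetes/kubelet.conf in the FLAG dump above). A hedged sketch of the equivalent KubeletConfiguration stanzas, using the values from the FLAG lines; field names follow the kubelet.config.k8s.io/v1beta1 API, and the fragment is illustrative, not the node's actual config file:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Equivalent of --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
systemReserved:
  cpu: 200m
  memory: 350Mi
  ephemeral-storage: 350Mi
# Equivalent of --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
registerWithTaints:
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
# Equivalent of --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Equivalent of --container-runtime-endpoint="/var/run/crio/crio.sock"
containerRuntimeEndpoint: /var/run/crio/crio.sock
```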
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437681 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437684 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437688 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437691 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437694 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437697 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437701 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437705 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437708 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437712 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437715 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437719 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437722 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437725 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 22 00:08:49 crc kubenswrapper[5116]: 
W0322 00:08:49.437729 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437732 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437735 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437738 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437741 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437746 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437750 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437753 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437756 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437759 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437763 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437767 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437770 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437773 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322
00:08:49.437776 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437780 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437783 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437786 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437790 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437793 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437796 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437799 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437802 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437805 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437809 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.437816 5116 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437918 5116 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437924 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437928 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437931 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437935 5116 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437938 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437942 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437945 5116 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437948 5116 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437952 5116 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
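The `feature_gate.go:384` records above dump the effective gate map in Go's `map[Key:value ...]` notation. A minimal sketch of pulling such a line apart into a dictionary, assuming only the format visible in the log; `parse_feature_gates` is a throwaway helper of my own, not part of any Kubernetes or klog tooling:

```python
import re

def parse_feature_gates(entry: str) -> dict:
    """Parse a kubelet 'feature gates: {map[Key:bool ...]}' log record
    into {gate_name: enabled}. Illustrative helper only."""
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", entry)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = (value == "true")
    return gates

# Shortened version of the record logged above
entry = ('I0322 00:08:49.437816 5116 feature_gate.go:384] feature gates: '
         '{map[ImageVolume:true KMSv1:true NodeSwap:false]}')
print(parse_feature_gates(entry))  # {'ImageVolume': True, 'KMSv1': True, 'NodeSwap': False}
```

Note the many "unrecognized feature gate" warnings are expected on OpenShift: cluster-level gates are passed down to a kubelet that only knows the upstream Kubernetes subset, so anything OpenShift-specific is warned about and ignored.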
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437957 5116 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437960 5116 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437964 5116 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437967 5116 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437970 5116 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437973 5116 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437977 5116 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437981 5116 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437984 5116 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437987 5116 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437990 5116 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437994 5116 feature_gate.go:328] unrecognized feature gate: Example
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.437997 5116 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438000 5116 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322
00:08:49.438004 5116 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438007 5116 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438010 5116 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438013 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438016 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438019 5116 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438023 5116 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438026 5116 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438030 5116 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438033 5116 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438036 5116 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438040 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438043 5116 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438046 5116 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438049 5116 feature_gate.go:328] unrecognized
feature gate: AWSClusterHostedDNSInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438053 5116 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438056 5116 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438060 5116 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438063 5116 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438066 5116 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438070 5116 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438073 5116 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438076 5116 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438079 5116 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438083 5116 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438086 5116 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438089 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438092 5116 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438096 5116 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 22 00:08:49 crc
kubenswrapper[5116]: W0322 00:08:49.438099 5116 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438102 5116 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438105 5116 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438108 5116 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438112 5116 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438116 5116 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438121 5116 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438125 5116 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438129 5116 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438132 5116 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438136 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438139 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438142 5116 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438146 5116 feature_gate.go:328] unrecognized feature gate:
SigstoreImageVerification
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438150 5116 feature_gate.go:328] unrecognized feature gate: Example2
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438153 5116 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438156 5116 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438159 5116 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438177 5116 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438181 5116 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438185 5116 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438189 5116 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438194 5116 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438198 5116 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438202 5116 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438206 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438210 5116 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438215 5116 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 22 00:08:49 crc kubenswrapper[5116]:
W0322 00:08:49.438219 5116 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438223 5116 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438228 5116 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438232 5116 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 22 00:08:49 crc kubenswrapper[5116]: W0322 00:08:49.438236 5116 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.438243 5116 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.439022 5116 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.442845 5116 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2025-12-03 08:27:53 +0000 UTC" logger="UnhandledError"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.445936 5116 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.446032 5116 certificate_store.go:147] "Loading cert/key pair from a file"
filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.446982 5116 server.go:1019] "Starting client certificate rotation"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.447126 5116 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.447225 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.471825 5116 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.474959 5116 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.475368 5116 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.488138 5116 log.go:25] "Validated CRI v1 runtime API"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.535685 5116 log.go:25] "Validated CRI v1 image API"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.539447 5116 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.542995 5116 fs.go:135] Filesystem UUIDs: map[19e76f87-96b8-4794-9744-0b33dca22d5b:/dev/vda3 2026-03-22-00-02-24-00:/dev/sr0 5eb7c122-420e-4494-80ec-41664070d7b6:/dev/vda4 7B77-95E7:/dev/vda2]
Mar 22 00:08:49 crc
kubenswrapper[5116]: I0322 00:08:49.543042 5116 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:44 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:45 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.557471 5116 manager.go:217] Machine: {Timestamp:2026-03-22 00:08:49.555508433 +0000 UTC m=+0.577809826 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33649926144 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:80bc4fba336e4ca1bc9d28a8be52a356 SystemUUID:6f1c9f09-bd93-4412-afb3-903004a8bcf7 BootID:4e17d39b-4bf4-4f5d-b01b-aaffc38eb890 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16824963072 Type:vfs Inodes:4107657 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6729986048 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6545408 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16824963072 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:44 Capacity:3364990976 
Type:vfs Inodes:821531 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:45 Capacity:1073741824 Type:vfs Inodes:4107657 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:12:96:9a Speed:0 Mtu:1500} {Name:br-int MacAddress:b2:a9:9f:57:07:84 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:12:96:9a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7b:9f:99 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:bb:8d:58 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4c:e6:53 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c1:20:11 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:a6:05:59:a8:19:71 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:d6:f7:76:37:16:06 Speed:0 Mtu:1500} {Name:tap0 MacAddress:5a:94:ef:e4:0c:ee Speed:10 Mtu:1500}] Topology:[{Id:0 Memory:33649926144 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.557711 
5116 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.558044 5116 manager.go:233] Version: {KernelVersion:5.14.0-570.57.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20251021-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560238 5116 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560280 5116 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560520 5116 topology_manager.go:138] "Creating topology manager with none policy"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560536 5116 container_manager_linux.go:306] "Creating device plugin manager"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.560561 5116 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.562648 5116 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.562981 5116 state_mem.go:36] "Initialized new in-memory state store"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.563183 5116 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567522 5116 kubelet.go:491] "Attempting to sync node with API server"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567631 5116 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567661 5116 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567682 5116 kubelet.go:397] "Adding apiserver pod source"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.567698 5116 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.573620 5116 state_checkpoint.go:81] "State checkpoint: restored pod resource
state from checkpoint"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.573663 5116 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.577827 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.577823 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.578705 5116 state_checkpoint.go:81] "State checkpoint: restored pod resource state from checkpoint"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.578725 5116 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.587409 5116 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.5-3.rhaos4.20.gitd0ea985.el9" apiVersion="v1"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.587865 5116 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-server-current.pem"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.588687 5116 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322
00:08:49.589615 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589636 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589644 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589650 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589657 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589664 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589671 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589678 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589687 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589700 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.589710 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.590493 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.590543 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.590553 5116 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322
00:08:49.594127 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612285 5116 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612864 5116 server.go:1295] "Started kubelet" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612850 5116 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.612921 5116 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.613452 5116 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.613838 5116 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 22 00:08:49 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.616046 5116 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.616490 5116 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.617577 5116 server.go:317] "Adding debug handlers to kubelet server"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.618317 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.619199 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.619340 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.619542 5116 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.619155 5116 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189f013aa607e028 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,LastTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624214 5116 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624278 5116 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624326 5116 factory.go:153] Registering CRI-O factory
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624361 5116 factory.go:223] Registration of the crio container factory successfully
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624441 5116 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624452 5116 factory.go:55] Registering systemd factory
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624458 5116 factory.go:223] Registration of the systemd container factory successfully
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624475 5116 factory.go:103] Registering Raw factory
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.624486 5116 manager.go:1196] Started watching for new ooms in manager
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.625469 5116 manager.go:319] Starting recovery of all containers
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.648737 5116 manager.go:324] Recovery completed
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.669732 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.671628 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.671697 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.671710 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.672440 5116 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.672462 5116 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.672487 5116 state_mem.go:36] "Initialized new in-memory state store"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.680047 5116 policy_none.go:49] "None policy: Start"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.680547 5116 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.680566 5116 state_mem.go:35] "Initializing new in-memory state store"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.694252 5116 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696083 5116 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696130 5116 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696157 5116 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.696211 5116 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.697836 5116 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.698116 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705055 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705113 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705139 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705150 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705161 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705199 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705211 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705223 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705239 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705282 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705328 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705337 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705346 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705378 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705392 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705411 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705420 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705430 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705453 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705483 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705494 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705506 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705516 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705526 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705534 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705544 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705583 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705594 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705619 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705630 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b87002-b798-480a-8e17-83053d698239" volumeName="kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705639 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705649 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705661 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705668 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705686 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705694 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705703 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705712 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705725 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705757 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705779 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.705788 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706006 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706016 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e093be35-bb62-4843-b2e8-094545761610" volumeName="kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706027 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706036 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706044 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706088 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706096 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706126 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706147 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706158 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706185 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706198 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706208 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706218 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706247 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706294 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706318 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706328 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706336 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706347 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706358 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706369 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706379 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706388 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706410 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706420 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706428 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706436 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706446 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706455 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706477 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" volumeName="kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706486 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706510 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706519 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7e2c886-118e-43bb-bef1-c78134de392b" volumeName="kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706528 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b638b8f4bb0070e40528db779baf6a2" volumeName="kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706535 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706543 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706567 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706575 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706611 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706659 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706682 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706693 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706718 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706727 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706738 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706752 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" seLinuxMountContext=""
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706760 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle"
seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706772 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce090a97-9ab6-4c40-a719-64ff2acd9778" volumeName="kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706780 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706789 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706798 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706807 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706818 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 
00:08:49.706827 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706836 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706847 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706856 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3a14caf222afb62aaabdc47808b6f944" volumeName="kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706866 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706876 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706884 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706892 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706905 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706912 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706922 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0effdbcf-dd7d-404d-9d48-77536d665a5d" volumeName="kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706931 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ee8fbd3-1f81-4666-96da-5afc70819f1a" volumeName="kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706940 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" 
volumeName="kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706949 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706958 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20c5c5b4bed930554494851fe3cb2b2a" volumeName="kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.706975 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2325ffef-9d5b-447f-b00e-3efc429acefe" volumeName="kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707002 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707012 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707020 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" seLinuxMountContext="" Mar 22 00:08:49 crc 
kubenswrapper[5116]: I0322 00:08:49.707029 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707045 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707054 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707063 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707071 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707083 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707092 5116 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707102 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707111 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" volumeName="kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707120 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707133 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" volumeName="kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707142 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707152 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707161 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707213 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707226 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707236 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707257 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707268 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" 
volumeName="kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707276 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707287 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" volumeName="kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707296 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707307 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707316 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="428b39f5-eb1c-4f65-b7a4-eeb6e84860cc" volumeName="kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707326 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" 
volumeName="kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707335 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707345 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707355 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707366 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707375 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707385 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" 
seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707401 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707411 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707420 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707430 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d7e8f42f-dc0e-424b-bb56-5ec849834888" volumeName="kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707439 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707448 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707461 5116 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707471 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707480 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707490 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="92dfbade-90b6-4169-8c07-72cff7f2c82b" volumeName="kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.707500 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711315 5116 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount" Mar 22 00:08:49 crc 
kubenswrapper[5116]: I0322 00:08:49.711393 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f0bc7fcb0822a2c13eb2d22cd8c0641" volumeName="kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711423 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b605f283-6f2e-42da-a838-54421690f7d0" volumeName="kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711438 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711453 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711467 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711479 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711490 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711501 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711514 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94a6e063-3d1a-4d44-875d-185291448c31" volumeName="kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711524 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711537 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="34177974-8d82-49d2-a763-391d0df3bbd8" volumeName="kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711549 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711562 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="18f80adb-c1c3-49ba-8ee4-932c851d3897" 
volumeName="kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711577 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711596 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711610 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc85e424-18b2-4924-920b-bd291a8c4b01" volumeName="kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711638 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711649 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711670 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5f2bfad-70f6-4185-a3d9-81ce12720767" volumeName="kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" 
seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711682 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d565531a-ff86-4608-9d19-767de01ac31b" volumeName="kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711697 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711708 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f559dfa3-3917-43a2-97f6-61ddfda10e93" volumeName="kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711720 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711733 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711744 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711756 5116 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a555ff2e-0be6-46d5-897d-863bb92ae2b3" volumeName="kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711769 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" volumeName="kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711781 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711793 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="593a3561-7760-45c5-8f91-5aaef7475d0f" volumeName="kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711804 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711816 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81e39f7b-62e4-4fc9-992a-6535ce127a02" volumeName="kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711829 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" volumeName="kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711841 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711855 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711868 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7afa918d-be67-40a6-803c-d3b0ae99d815" volumeName="kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711880 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711892 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9f71a554-e414-4bc3-96d2-674060397afe" volumeName="kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711902 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" 
volumeName="kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711914 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af41de71-79cf-4590-bbe9-9e8b848862cb" volumeName="kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711926 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="16bdd140-dce1-464c-ab47-dd5798d1d256" volumeName="kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711943 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711955 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7599e0b6-bddf-4def-b7f2-0b32206e8651" volumeName="kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711969 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="149b3c48-e17c-4a66-a835-d86dabf6ff13" volumeName="kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711983 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="301e1965-1754-483d-b6cc-bfae7038bbca" 
volumeName="kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.711995 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31fa8943-81cc-4750-a0b7-0fa9ab5af883" volumeName="kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712006 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b4750666-1362-4001-abd0-6f89964cc621" volumeName="kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712018 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712031 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" volumeName="kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712045 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712056 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01080b46-74f1-4191-8755-5152a57b3b25" volumeName="kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" 
seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712070 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712098 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc4541ce-7789-4670-bc75-5c2868e52ce0" volumeName="kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712116 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712126 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712136 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712148 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712182 
5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712206 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="869851b9-7ffb-4af0-b166-1d8aa40a5f80" volumeName="kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712225 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a208c9c2-333b-4b4a-be0d-bc32ec38a821" volumeName="kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712235 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712245 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="584e1f4a-8205-47d7-8efb-3afc6017c4c9" volumeName="kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712256 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" volumeName="kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712267 5116 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="567683bd-0efc-4f21-b076-e28559628404" volumeName="kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712276 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6077b63e-53a2-4f96-9d56-1ce0324e4913" volumeName="kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712286 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7df94c10-441d-4386-93a6-6730fb7bcde0" volumeName="kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712299 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09cfa50b-4138-4585-a53e-64dd3ab73335" volumeName="kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712309 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="42a11a02-47e1-488f-b270-2679d3298b0e" volumeName="kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712388 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5ebfebf6-3ecd-458e-943f-bb25b52e2718" volumeName="kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712400 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="736c54fe-349c-4bb9-870a-d1c1d1c03831" 
volumeName="kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712410 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" volumeName="kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712419 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9e9b5059-1b3e-4067-a63d-2952cbe863af" volumeName="kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712430 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" volumeName="kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712441 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="af33e427-6803-48c2-a76a-dd9deb7cbf9a" volumeName="kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712452 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d19cb085-0c5b-4810-b654-ce7923221d90" volumeName="kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712463 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" volumeName="kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" seLinuxMountContext="" Mar 22 
00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712473 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" volumeName="kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712482 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" volumeName="kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712493 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712502 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c491984c-7d4b-44aa-8c1e-d7974424fa47" volumeName="kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712512 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" volumeName="kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712522 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6edfcf45-925b-4eff-b940-95b6fc0b85d4" volumeName="kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 
00:08:49.712532 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7a88189-c967-4640-879e-27665747f20c" volumeName="kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712542 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f65c0ac1-8bca-454d-a2e6-e35cb418beac" volumeName="kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712552 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f863fff9-286a-45fa-b8f0-8a86994b8440" volumeName="kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712563 5116 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" volumeName="kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" seLinuxMountContext="" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712574 5116 reconstruct.go:97] "Volume reconstruction finished" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.712582 5116 reconciler.go:26] "Reconciler: start to sync state" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.719458 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.731570 5116 manager.go:341] "Starting Device Plugin manager" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.731828 5116 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 22 00:08:49 crc kubenswrapper[5116]: 
I0322 00:08:49.731850 5116 server.go:85] "Starting device plugin registration server" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732281 5116 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732301 5116 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732482 5116 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732587 5116 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.732602 5116 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.736533 5116 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.736601 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.797907 5116 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798129 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798777 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798829 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.798845 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.799608 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.799768 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.799813 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800194 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800221 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800233 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800265 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800284 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.800294 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.801253 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.801325 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.801358 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802409 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802445 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802481 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802444 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802572 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.802586 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.803469 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.803600 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.803651 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804196 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804236 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804251 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804208 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804281 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.804295 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.805515 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.805547 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.805664 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.806149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.806190 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.806203 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.808291 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.808339 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.808352 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.809057 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.809197 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.810500 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.810523 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.810535 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.819612 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.831290 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.832606 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833415 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833461 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833474 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.833500 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.834043 5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.837608 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.859314 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.891640 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: E0322 00:08:49.897632 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.914856 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.914919 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.914957 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915598 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915645 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915675 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915704 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915738 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915767 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915798 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915833 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915879 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915924 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915955 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.915985 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916021 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916052 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916080 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916150 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916238 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916271 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.916305 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917185 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-run-kubernetes\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-var-run-kubernetes\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917216 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917297 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917763 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-tmp-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917936 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b638b8f4bb0070e40528db779baf6a2-tmp\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.917948 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-dir\" (UniqueName: \"kubernetes.io/empty-dir/9f0bc7fcb0822a2c13eb2d22cd8c0641-ca-trust-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:49 crc kubenswrapper[5116]: I0322 00:08:49.918308 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/20c5c5b4bed930554494851fe3cb2b2a-tmp-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017820 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017893 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017916 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.017989 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-auto-backup-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-etcd-auto-backup-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018060 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-log-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-resource-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018132 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018270 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018321 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018365 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018404 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018445 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018489 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018532 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018575 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018614 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018666 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018708 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018760 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018941 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.018209 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019039 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019069 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019095 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-static-pod-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019118 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-usr-local-bin\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019151 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-data-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019194 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/9f0bc7fcb0822a2c13eb2d22cd8c0641-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"9f0bc7fcb0822a2c13eb2d22cd8c0641\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019219 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019241 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4e08c320b1e9e2405e6e0107bdf7eeb4-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"4e08c320b1e9e2405e6e0107bdf7eeb4\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019266 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/20c5c5b4bed930554494851fe3cb2b2a-cert-dir\") pod \"etcd-crc\" (UID: \"20c5c5b4bed930554494851fe3cb2b2a\") " pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019289 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.019353 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0b638b8f4bb0070e40528db779baf6a2-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"0b638b8f4bb0070e40528db779baf6a2\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.034916 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036143 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036266 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036296 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.036352 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.036976 5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.132426 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.139760 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.160613 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.184262 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a14caf222afb62aaabdc47808b6f944.slice/crio-53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955 WatchSource:0}: Error finding container 53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955: Status 404 returned error can't find the container with id 53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.188826 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.190301 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0bc7fcb0822a2c13eb2d22cd8c0641.slice/crio-377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888 WatchSource:0}: Error finding container 377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888: Status 404 returned error can't find the container with id 377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.191993 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.198526 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.198660 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b638b8f4bb0070e40528db779baf6a2.slice/crio-b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c WatchSource:0}: Error finding container b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c: Status 404 returned error can't find the container with id b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.213099 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e08c320b1e9e2405e6e0107bdf7eeb4.slice/crio-30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0 WatchSource:0}: Error finding container 30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0: Status 404 returned error can't find the container with id 30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0
Mar 22 00:08:50 crc kubenswrapper[5116]: W0322 00:08:50.217056 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20c5c5b4bed930554494851fe3cb2b2a.slice/crio-1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee WatchSource:0}: Error finding container 1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee: Status 404 returned error can't find the container with id 1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.220736 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.437816 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438682 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438740 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438756 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.438777 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.439107 5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.595834 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.701545 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"1e8e20269b5c698e1ce97359ff99cf27c22b371c2d789b4e8fa1f32a418dd0ee"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.704111 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"30a37cba6f80744af79a725a6d8345a102fb2ace6d12b7898310394ec8d6b7e0"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.705461 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"b9dfc92772583ec2aaf8b8d9e25d76eeed90c5b85fc9509472a19a461be9b22c"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.709070 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"377972f7d316901eb92f035066dcfd94a42b91254a72994084babc2138a26888"}
Mar 22 00:08:50 crc kubenswrapper[5116]: I0322 00:08:50.710585 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"53687b947ea38ea8f6bbba0ae3084b15a74f7b9470d203e9c04de68b3a834955"}
Mar 22 00:08:50 crc kubenswrapper[5116]: E0322 00:08:50.748194 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.021990 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.047298 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.112362 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.144247 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.240141 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241789 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241842 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241860 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.241887 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.242366 5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.595295 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.600768 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.601932 5116 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.714565 5116 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db" exitCode=0
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.714624 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db"}
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.714862 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.715567 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.715597 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.715607 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.715885 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.717134 5116 generic.go:358] "Generic (PLEG): container finished" podID="4e08c320b1e9e2405e6e0107bdf7eeb4" containerID="cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff" exitCode=0
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.717178 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerDied","Data":"cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff"}
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.717347 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.719040 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.719081 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.719104 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.719406 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar
22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.720455 5116 generic.go:358] "Generic (PLEG): container finished" podID="0b638b8f4bb0070e40528db779baf6a2" containerID="1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c" exitCode=0 Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.720527 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerDied","Data":"1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.720686 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.721578 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.721612 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.721621 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.721776 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.725207 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.725275 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.725292 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.726663 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" exitCode=0 Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.726694 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5"} Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.726821 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.727339 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.727384 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.727396 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:51 crc kubenswrapper[5116]: E0322 00:08:51.727595 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730134 
5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730727 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730769 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:51 crc kubenswrapper[5116]: I0322 00:08:51.730782 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.595864 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.624029 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.719513 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.734976 5116 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de" exitCode=0 Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.735047 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.735359 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.736258 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.736288 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.736298 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.736500 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.739075 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"4e08c320b1e9e2405e6e0107bdf7eeb4","Type":"ContainerStarted","Data":"52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.739184 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.740697 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.740721 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.740802 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.741098 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743287 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743317 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743328 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"0b638b8f4bb0070e40528db779baf6a2","Type":"ContainerStarted","Data":"5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.743373 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.744102 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.744241 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.744269 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc 
kubenswrapper[5116]: E0322 00:08:52.744674 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.746450 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.746544 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.747257 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.747288 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.747297 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.747460 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753549 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753582 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753593 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.753605 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7"} Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.842949 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843741 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843775 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843815 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:52 crc kubenswrapper[5116]: I0322 00:08:52.843874 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:08:52 crc kubenswrapper[5116]: E0322 00:08:52.844360 5116 kubelet_node_status.go:110] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.085524 5116 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.756708 5116 generic.go:358] "Generic (PLEG): container finished" podID="20c5c5b4bed930554494851fe3cb2b2a" containerID="c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195" exitCode=0 Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.756792 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerDied","Data":"c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195"} Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.756941 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.757477 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.757519 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.757532 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.757703 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.761589 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6"} Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.763056 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.763891 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.763918 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.764612 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.764656 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766517 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766557 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766572 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766730 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766816 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766924 5116 kubelet_node_status.go:736] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767012 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767037 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767047 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.766517 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767077 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:53 crc kubenswrapper[5116]: I0322 00:08:53.767087 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.766946 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.767572 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.767804 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:53 crc kubenswrapper[5116]: E0322 00:08:53.768288 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.540939 5116 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.766911 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767274 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767390 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767416 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767424 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767433 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767440 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"20c5c5b4bed930554494851fe3cb2b2a","Type":"ContainerStarted","Data":"88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51"} Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767536 5116 kubelet_node_status.go:413] 
"Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767949 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767985 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.767998 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768153 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768212 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768224 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768223 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768257 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:54 crc kubenswrapper[5116]: I0322 00:08:54.768271 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:54 crc kubenswrapper[5116]: E0322 00:08:54.768279 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:54 crc kubenswrapper[5116]: E0322 00:08:54.768490 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" 
err="node \"crc\" not found" node="crc" Mar 22 00:08:54 crc kubenswrapper[5116]: E0322 00:08:54.768821 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.112319 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.112540 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.113739 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.113790 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.113801 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:08:55 crc kubenswrapper[5116]: E0322 00:08:55.114117 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.705235 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.769910 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.769910 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.770939 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory"
Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.770990 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771010 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771025 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771038 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:55 crc kubenswrapper[5116]: I0322 00:08:55.771048 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:55 crc kubenswrapper[5116]: E0322 00:08:55.771539 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:55 crc kubenswrapper[5116]: E0322 00:08:55.772133 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.044832 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046122 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046227 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046244 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.046281 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.064351 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.064696 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.066074 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.066144 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:56 crc kubenswrapper[5116]: I0322 00:08:56.066187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:56 crc kubenswrapper[5116]: E0322 00:08:56.066685 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.018413 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.018722 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.019674 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.019740 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.019760 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:57 crc kubenswrapper[5116]: E0322 00:08:57.020277 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.880981 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.881363 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.882546 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.882599 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:57 crc kubenswrapper[5116]: I0322 00:08:57.882612 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:57 crc kubenswrapper[5116]: E0322 00:08:57.883091 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.113574 5116 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": context deadline exceeded" start-of-body=
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.113674 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": context deadline exceeded"
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.874506 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-etcd/etcd-crc"
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.874744 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.875645 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.875753 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:08:58 crc kubenswrapper[5116]: I0322 00:08:58.875778 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:08:58 crc kubenswrapper[5116]: E0322 00:08:58.876703 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:08:59 crc kubenswrapper[5116]: E0322 00:08:59.737357 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.489593 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.489901 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.490815 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.490864 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:00 crc kubenswrapper[5116]: I0322 00:09:00.490880 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:00 crc kubenswrapper[5116]: E0322 00:09:00.491306 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.703755 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.704321 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.705233 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.705312 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.705333 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:01 crc kubenswrapper[5116]: E0322 00:09:01.705821 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.710959 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.788331 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.789220 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.789298 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.789324 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:01 crc kubenswrapper[5116]: E0322 00:09:01.789978 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.794075 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.853715 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.853992 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.854952 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.854986 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:01 crc kubenswrapper[5116]: I0322 00:09:01.854996 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:01 crc kubenswrapper[5116]: E0322 00:09:01.855348 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.790930 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.792108 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.792273 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.792307 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:02 crc kubenswrapper[5116]: E0322 00:09:02.793121 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.872240 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 22 00:09:02 crc kubenswrapper[5116]: I0322 00:09:02.872318 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.547253 5116 trace.go:236] Trace[673956096]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:53.546) (total time: 10001ms):
Mar 22 00:09:03 crc kubenswrapper[5116]: Trace[673956096]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (00:09:03.547)
Mar 22 00:09:03 crc kubenswrapper[5116]: Trace[673956096]: [10.00103103s] [10.00103103s] END
Mar 22 00:09:03 crc kubenswrapper[5116]: E0322 00:09:03.547319 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.596518 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.901251 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.901318 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.906492 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 22 00:09:03 crc kubenswrapper[5116]: I0322 00:09:03.906564 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 22 00:09:05 crc kubenswrapper[5116]: E0322 00:09:05.825548 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.028730 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.028920 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.029890 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.029917 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.029926 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:07 crc kubenswrapper[5116]: E0322 00:09:07.030204 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.035559 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.803764 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.804440 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.804474 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:07 crc kubenswrapper[5116]: I0322 00:09:07.804493 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:07 crc kubenswrapper[5116]: E0322 00:09:07.805061 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.114019 5116 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.114139 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.755763 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.899544 5116 trace.go:236] Trace[2115267700]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:56.443) (total time: 12455ms):
Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2115267700]: ---"Objects listed" error:csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope 12455ms (00:09:08.899)
Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2115267700]: [12.455651425s] [12.455651425s] END
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.899578 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.900643 5116 trace.go:236] Trace[2079725794]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:54.293) (total time: 14606ms):
Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2079725794]: ---"Objects listed" error:nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope 14606ms (00:09:08.900)
Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[2079725794]: [14.606786418s] [14.606786418s] END
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.900680 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.900583 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa607e028 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,LastTimestamp:2026-03-22 00:08:49.612529704 +0000 UTC m=+0.634831087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.903276 5116 trace.go:236] Trace[885013279]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Mar-2026 00:08:56.783) (total time: 12119ms):
Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[885013279]: ---"Objects listed" error:services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope 12119ms (00:09:08.903)
Mar 22 00:09:08 crc kubenswrapper[5116]: Trace[885013279]: [12.119439895s] [12.119439895s] END
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.903308 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.904189 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.904250 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.906505 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.908518 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: I0322 00:09:08.909986 5116 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.911047 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aad4f781c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.734662172 +0000 UTC m=+0.756963545,LastTimestamp:2026-03-22 00:08:49.734662172 +0000 UTC m=+0.756963545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.913464 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.798809755 +0000 UTC m=+0.821111128,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.915703 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.798837721 +0000 UTC m=+0.821139094,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.918181 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.798852044 +0000 UTC m=+0.821153417,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.920759 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.800211118 +0000 UTC m=+0.822512491,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.926614 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.800227142 +0000 UTC m=+0.822528515,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.928862 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.800239674 +0000 UTC m=+0.822541047,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.934305 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.800277022 +0000 UTC m=+0.822578395,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.941865 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.800289224 +0000 UTC m=+0.822590597,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.946093 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.800299876 +0000 UTC m=+0.822601249,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.950892 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.802425547 +0000 UTC m=+0.824726920,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.955538 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.802476998 +0000 UTC m=+0.824778371,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.960437 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.80248712 +0000 UTC m=+0.824788493,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.965577 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.802544762 +0000 UTC m=+0.824846135,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.970936 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.802580859 +0000 UTC m=+0.824882242,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.976214 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.802593171 +0000 UTC m=+0.824894544,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.981450 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.804228082 +0000 UTC m=+0.826529455,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.986083 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.804243305 +0000 UTC m=+0.826544688,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.991115 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ef810\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ef810 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671714832 +0000 UTC m=+0.694016205,LastTimestamp:2026-03-22 00:08:49.804257558 +0000 UTC m=+0.826558931,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.995342 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98e7a87\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98e7a87 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.671682695 +0000 UTC m=+0.693984068,LastTimestamp:2026-03-22 00:08:49.804270421 +0000 UTC m=+0.826571794,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:08 crc kubenswrapper[5116]: E0322 00:09:08.998873 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189f013aa98ecffa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189f013aa98ecffa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:49.67170457 +0000 UTC m=+0.694005943,LastTimestamp:2026-03-22 00:08:49.804290034 +0000 UTC m=+0.826591408,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.003557 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013ac8670d06 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.189192454 +0000 UTC m=+1.211493837,LastTimestamp:2026-03-22 00:08:50.189192454 +0000 UTC m=+1.211493837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.007001 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013ac8e58c67 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.197482599 +0000 UTC m=+1.219783972,LastTimestamp:2026-03-22 00:08:50.197482599 +0000 UTC m=+1.219783972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.012335 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013ac9629a14 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.2056781 +0000 UTC m=+1.227979473,LastTimestamp:2026-03-22 00:08:50.2056781 +0000 UTC m=+1.227979473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.020635 5116 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013aca1701c7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.217501127 +0000 UTC m=+1.239802520,LastTimestamp:2026-03-22 00:08:50.217501127 +0000 UTC m=+1.239802520,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.025596 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013aca4e627c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.221130364 +0000 UTC 
m=+1.243431737,LastTimestamp:2026-03-22 00:08:50.221130364 +0000 UTC m=+1.243431737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.030012 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013aefe1f70a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851559178 +0000 UTC m=+1.873860571,LastTimestamp:2026-03-22 00:08:50.851559178 +0000 UTC m=+1.873860571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.036300 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013aefe36e0d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created 
container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851655181 +0000 UTC m=+1.873956564,LastTimestamp:2026-03-22 00:08:50.851655181 +0000 UTC m=+1.873956564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.040920 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013aefe5f8d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851821777 +0000 UTC m=+1.874123150,LastTimestamp:2026-03-22 00:08:50.851821777 +0000 UTC m=+1.874123150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.045464 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013aefe75d14 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container: wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.85191298 +0000 UTC m=+1.874214353,LastTimestamp:2026-03-22 00:08:50.85191298 +0000 UTC m=+1.874214353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.050427 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013aefe87d36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.851986742 +0000 UTC m=+1.874288135,LastTimestamp:2026-03-22 00:08:50.851986742 +0000 UTC m=+1.874288135,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.054688 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013af0b5c250 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.865439312 +0000 UTC m=+1.887740695,LastTimestamp:2026-03-22 00:08:50.865439312 +0000 UTC m=+1.887740695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.058876 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013af0bf7732 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866075442 +0000 UTC m=+1.888376835,LastTimestamp:2026-03-22 00:08:50.866075442 +0000 UTC m=+1.888376835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.066521 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013af0c08bcf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866146255 +0000 UTC m=+1.888447628,LastTimestamp:2026-03-22 00:08:50.866146255 +0000 UTC m=+1.888447628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.071021 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013af0c18fdb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866212827 +0000 UTC m=+1.888514200,LastTimestamp:2026-03-22 00:08:50.866212827 +0000 UTC m=+1.888514200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.075766 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013af0c665d3 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.866529747 +0000 UTC m=+1.888831130,LastTimestamp:2026-03-22 00:08:50.866529747 +0000 UTC m=+1.888831130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.079974 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013af0dc9a75 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:50.867985013 +0000 UTC m=+1.890286386,LastTimestamp:2026-03-22 00:08:50.867985013 +0000 UTC m=+1.890286386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.085311 5116 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b00b301b1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.133694385 +0000 UTC m=+2.155995758,LastTimestamp:2026-03-22 00:08:51.133694385 +0000 UTC m=+2.155995758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.090571 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b014f9a50 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.143957072 +0000 UTC m=+2.166258445,LastTimestamp:2026-03-22 00:08:51.143957072 +0000 UTC m=+2.166258445,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.095611 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b015ea35f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.144942431 +0000 UTC m=+2.167243814,LastTimestamp:2026-03-22 00:08:51.144942431 +0000 UTC m=+2.167243814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.102026 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b1b045f6a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container: kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.57523441 +0000 UTC m=+2.597535783,LastTimestamp:2026-03-22 00:08:51.57523441 +0000 UTC m=+2.597535783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.109301 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b1c14273e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.593045822 +0000 UTC m=+2.615347195,LastTimestamp:2026-03-22 00:08:51.593045822 +0000 UTC m=+2.615347195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.110693 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b1c24a778 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.594127224 +0000 UTC m=+2.616428597,LastTimestamp:2026-03-22 00:08:51.594127224 +0000 UTC m=+2.616428597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.116067 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b237610ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.716903118 +0000 UTC m=+2.739204491,LastTimestamp:2026-03-22 00:08:51.716903118 +0000 UTC 
m=+2.739204491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.123715 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013b23afd054 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.7206877 +0000 UTC m=+2.742989073,LastTimestamp:2026-03-22 00:08:51.7206877 +0000 UTC m=+2.742989073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.128474 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b240627ff openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.726346239 +0000 UTC m=+2.748647612,LastTimestamp:2026-03-22 00:08:51.726346239 +0000 UTC m=+2.748647612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.133013 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b243e3ccb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.730021579 +0000 UTC m=+2.752322952,LastTimestamp:2026-03-22 00:08:51.730021579 +0000 UTC m=+2.752322952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.137679 5116 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b29643338 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container: kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.816395576 +0000 UTC m=+2.838696949,LastTimestamp:2026-03-22 00:08:51.816395576 +0000 UTC m=+2.838696949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.142824 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013b2b476fad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.848064941 +0000 UTC m=+2.870366314,LastTimestamp:2026-03-22 00:08:51.848064941 +0000 UTC 
m=+2.870366314,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.147349 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b2df8d839 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container: etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.893246009 +0000 UTC m=+2.915547382,LastTimestamp:2026-03-22 00:08:51.893246009 +0000 UTC m=+2.915547382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.154415 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b2f7a7013 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.918516243 +0000 UTC m=+2.940817616,LastTimestamp:2026-03-22 
00:08:51.918516243 +0000 UTC m=+2.940817616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.160222 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b2fbd18e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.922884833 +0000 UTC m=+2.945186206,LastTimestamp:2026-03-22 00:08:51.922884833 +0000 UTC m=+2.945186206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.165465 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b30a93fe4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.938361316 +0000 UTC 
m=+2.960662679,LastTimestamp:2026-03-22 00:08:51.938361316 +0000 UTC m=+2.960662679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.172617 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b30b92c1a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.939404826 +0000 UTC m=+2.961706199,LastTimestamp:2026-03-22 00:08:51.939404826 +0000 UTC m=+2.961706199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.177598 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013b34285a10 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:51.997022736 +0000 UTC m=+3.019324109,LastTimestamp:2026-03-22 00:08:51.997022736 +0000 UTC m=+3.019324109,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.186231 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b35479ca9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.015848617 +0000 UTC m=+3.038149990,LastTimestamp:2026-03-22 00:08:52.015848617 +0000 UTC m=+3.038149990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.192237 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189f013b355d600a 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:4e08c320b1e9e2405e6e0107bdf7eeb4,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.01727489 +0000 UTC m=+3.039576263,LastTimestamp:2026-03-22 00:08:52.01727489 +0000 UTC m=+3.039576263,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.198322 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b3613fab5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.029242037 +0000 UTC m=+3.051543410,LastTimestamp:2026-03-22 00:08:52.029242037 +0000 UTC m=+3.051543410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.206970 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b36977ef0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.037861104 +0000 UTC m=+3.060162477,LastTimestamp:2026-03-22 00:08:52.037861104 +0000 UTC m=+3.060162477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.211643 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b3dd7f4cf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container: kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.159526095 +0000 UTC m=+3.181827468,LastTimestamp:2026-03-22 00:08:52.159526095 +0000 UTC m=+3.181827468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: 
E0322 00:09:09.213067 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b3ecceded openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.175580653 +0000 UTC m=+3.197882026,LastTimestamp:2026-03-22 00:08:52.175580653 +0000 UTC m=+3.197882026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.216548 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b3edd5e2e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.176657966 +0000 UTC 
m=+3.198959339,LastTimestamp:2026-03-22 00:08:52.176657966 +0000 UTC m=+3.198959339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.217732 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b42e74eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container: kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.244418287 +0000 UTC m=+3.266719660,LastTimestamp:2026-03-22 00:08:52.244418287 +0000 UTC m=+3.266719660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.224635 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b43e232a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.260860579 +0000 UTC m=+3.283161952,LastTimestamp:2026-03-22 00:08:52.260860579 +0000 UTC m=+3.283161952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.230301 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b43ef7c10 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.261731344 +0000 UTC m=+3.284032717,LastTimestamp:2026-03-22 00:08:52.261731344 +0000 UTC m=+3.284032717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.239740 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b4d67adc6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container: kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.420603334 +0000 UTC m=+3.442904707,LastTimestamp:2026-03-22 00:08:52.420603334 +0000 UTC m=+3.442904707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.245011 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189f013b4e0091a7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:0b638b8f4bb0070e40528db779baf6a2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.430623143 +0000 UTC m=+3.452924516,LastTimestamp:2026-03-22 00:08:52.430623143 +0000 UTC m=+3.452924516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.249741 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b50cce364 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container: kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.477567844 +0000 UTC m=+3.499869217,LastTimestamp:2026-03-22 00:08:52.477567844 +0000 UTC m=+3.499869217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.254813 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b51de25bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.495476159 +0000 UTC m=+3.517777522,LastTimestamp:2026-03-22 00:08:52.495476159 +0000 UTC m=+3.517777522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.259642 5116 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b51ee6e6c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.49654334 +0000 UTC m=+3.518844713,LastTimestamp:2026-03-22 00:08:52.49654334 +0000 UTC m=+3.518844713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.264351 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5d7976e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.690204389 +0000 UTC m=+3.712505762,LastTimestamp:2026-03-22 00:08:52.690204389 +0000 UTC 
m=+3.712505762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.269254 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e7e9435 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.707316789 +0000 UTC m=+3.729618162,LastTimestamp:2026-03-22 00:08:52.707316789 +0000 UTC m=+3.729618162,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.274203 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e8dae9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,LastTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.279054 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b605f8bc0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.73883744 +0000 UTC m=+3.761138813,LastTimestamp:2026-03-22 00:08:52.73883744 +0000 UTC m=+3.761138813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.285083 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6e019740 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,LastTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.289515 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b6e07b227 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container: etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967961127 +0000 UTC m=+3.990262500,LastTimestamp:2026-03-22 00:08:52.967961127 +0000 UTC m=+3.990262500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.293356 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6f3968d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,LastTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.297276 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b6f5d5c65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.990352485 +0000 UTC m=+4.012653858,LastTimestamp:2026-03-22 00:08:52.990352485 +0000 UTC m=+4.012653858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.300918 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013b9d2aa445 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.758780485 +0000 UTC m=+4.781081858,LastTimestamp:2026-03-22 00:08:53.758780485 +0000 UTC m=+4.781081858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.305310 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013baa563f20 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.979741984 +0000 UTC m=+5.002043357,LastTimestamp:2026-03-22 00:08:53.979741984 +0000 UTC m=+5.002043357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.309338 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013baae032f2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.988782834 +0000 UTC m=+5.011084207,LastTimestamp:2026-03-22 00:08:53.988782834 +0000 UTC m=+5.011084207,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.314045 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013baaecf1c7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:53.989618119 +0000 UTC m=+5.011919492,LastTimestamp:2026-03-22 00:08:53.989618119 +0000 UTC m=+5.011919492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.317620 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bb5a1dc16 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.169246742 +0000 UTC m=+5.191548115,LastTimestamp:2026-03-22 00:08:54.169246742 +0000 UTC m=+5.191548115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.321591 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bb6a04e4c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.185922124 +0000 UTC m=+5.208223507,LastTimestamp:2026-03-22 00:08:54.185922124 +0000 UTC m=+5.208223507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.325678 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bb6b5308b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.187290763 +0000 UTC m=+5.209592126,LastTimestamp:2026-03-22 00:08:54.187290763 +0000 UTC m=+5.209592126,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.329659 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bc2888400 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container: etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.3856896 +0000 UTC m=+5.407990973,LastTimestamp:2026-03-22 00:08:54.3856896 +0000 UTC m=+5.407990973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.333595 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bc344cf4e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.398029646 +0000 UTC m=+5.420331019,LastTimestamp:2026-03-22 00:08:54.398029646 +0000 UTC m=+5.420331019,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.337597 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bc3549ab4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.399064756 +0000 UTC m=+5.421366149,LastTimestamp:2026-03-22 00:08:54.399064756 +0000 UTC m=+5.421366149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.340985 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bce189c5d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container: etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.579682397 +0000 UTC m=+5.601983770,LastTimestamp:2026-03-22 00:08:54.579682397 +0000 UTC m=+5.601983770,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.344588 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bcecd9341 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.591542081 +0000 UTC m=+5.613843444,LastTimestamp:2026-03-22 00:08:54.591542081 +0000 UTC m=+5.613843444,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.348696 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bcedf89d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.592719315 +0000 UTC m=+5.615020688,LastTimestamp:2026-03-22 00:08:54.592719315 +0000 UTC m=+5.615020688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.353856 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bd84b3192 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container: etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.750769554 +0000 UTC m=+5.773070927,LastTimestamp:2026-03-22 00:08:54.750769554 +0000 UTC m=+5.773070927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.357983 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189f013bd9080982 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:20c5c5b4bed930554494851fe3cb2b2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:54.763145602 +0000 UTC m=+5.785446975,LastTimestamp:2026-03-22 00:08:54.763145602 +0000 UTC m=+5.785446975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.365793 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.365849 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer"
Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.366129 5116 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.366153 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.366272 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-controller-manager-crc.189f013ca0bc9e2c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://localhost:10357/healthz": context deadline exceeded
Mar 22 00:09:09 crc kubenswrapper[5116]: body:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:58.113646124 +0000 UTC m=+9.135947487,LastTimestamp:2026-03-22 00:08:58.113646124 +0000 UTC m=+9.135947487,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.370737 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013ca0bdad78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://localhost:10357/healthz\": context deadline exceeded,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:58.113715576 +0000 UTC m=+9.136016949,LastTimestamp:2026-03-22 00:08:58.113715576 +0000 UTC m=+9.136016949,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.377248 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013dbc5fc9e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused
Mar 22 00:09:09 crc kubenswrapper[5116]: body:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:02.872291816 +0000 UTC m=+13.894593199,LastTimestamp:2026-03-22 00:09:02.872291816 +0000 UTC m=+13.894593199,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.382324 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013dbc609578 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:02.872343928 +0000 UTC m=+13.894645311,LastTimestamp:2026-03-22 00:09:02.872343928 +0000 UTC m=+13.894645311,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.387306 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5205d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 22 00:09:09 crc kubenswrapper[5116]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 22 00:09:09 crc kubenswrapper[5116]:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901294685 +0000 UTC m=+14.923596058,LastTimestamp:2026-03-22 00:09:03.901294685 +0000 UTC m=+14.923596058,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.391665 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5dd32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901343026 +0000 UTC m=+14.923644399,LastTimestamp:2026-03-22 00:09:03.901343026 +0000 UTC m=+14.923644399,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.395720 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013df9b5205d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5205d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 22 00:09:09 crc kubenswrapper[5116]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 22 00:09:09 crc kubenswrapper[5116]:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901294685 +0000 UTC m=+14.923596058,LastTimestamp:2026-03-22 00:09:03.90654013 +0000 UTC m=+14.928841503,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.400013 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013df9b5dd32\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013df9b5dd32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:03.901343026 +0000 UTC m=+14.923644399,LastTimestamp:2026-03-22 00:09:03.906588041 +0000 UTC m=+14.928889414,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.405046 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-controller-manager-crc.189f013ef4cf42a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://localhost:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 22 00:09:09 crc kubenswrapper[5116]: body:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:08.114088617 +0000 UTC m=+19.136389990,LastTimestamp:2026-03-22 00:09:08.114088617 +0000 UTC m=+19.136389990,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.409551 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189f013ef4d0c60f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:9f0bc7fcb0822a2c13eb2d22cd8c0641,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:08.114187791 +0000 UTC m=+19.136489154,LastTimestamp:2026-03-22 00:09:08.114187791 +0000 UTC m=+19.136489154,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.415340 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013f3f6b3d2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer
Mar 22 00:09:09 crc kubenswrapper[5116]: body:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.365824814 +0000 UTC m=+20.388126187,LastTimestamp:2026-03-22 00:09:09.365824814 +0000 UTC m=+20.388126187,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.420518 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f3f6c9857 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57226->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.365913687 +0000 UTC m=+20.388215060,LastTimestamp:2026-03-22 00:09:09.365913687 +0000 UTC m=+20.388215060,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.425393 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 22 00:09:09 crc kubenswrapper[5116]: &Event{ObjectMeta:{kube-apiserver-crc.189f013f3f7019f2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused
Mar 22 00:09:09 crc kubenswrapper[5116]: body:
Mar 22 00:09:09 crc kubenswrapper[5116]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.366143474 +0000 UTC m=+20.388444857,LastTimestamp:2026-03-22 00:09:09.366143474 +0000 UTC m=+20.388444857,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 22 00:09:09 crc kubenswrapper[5116]: >
Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.430114 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f3f70ecda openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:09.366197466 +0000 UTC m=+20.388498849,LastTimestamp:2026-03-22 00:09:09.366197466 +0000 UTC m=+20.388498849,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:09 crc
kubenswrapper[5116]: I0322 00:09:09.600919 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.737630 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.809192 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.811365 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6" exitCode=255 Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.811471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6"} Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.811825 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.812638 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.812688 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.812705 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:09 crc 
kubenswrapper[5116]: E0322 00:09:09.813109 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:09 crc kubenswrapper[5116]: I0322 00:09:09.813404 5116 scope.go:117] "RemoveContainer" containerID="e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6" Mar 22 00:09:09 crc kubenswrapper[5116]: E0322 00:09:09.823987 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b5e8dae9d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e8dae9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,LastTimestamp:2026-03-22 00:09:09.814470695 +0000 UTC m=+20.836772068,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.062911 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6e019740\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6e019740 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,LastTimestamp:2026-03-22 00:09:10.056907501 +0000 UTC m=+21.079208874,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.072068 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6f3968d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6f3968d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,LastTimestamp:2026-03-22 00:09:10.066933437 +0000 UTC m=+21.089234810,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.600076 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource 
"csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.815349 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.815907 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/0.log" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817207 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" exitCode=255 Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817252 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba"} Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817287 5116 scope.go:117] "RemoveContainer" containerID="e2c881c727397234f9ea0049cb8b50cf8c351a23d8feaf545b93e1a981e673d6" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.817474 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818008 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818039 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818052 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:10 crc 
kubenswrapper[5116]: E0322 00:09:10.818432 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:10 crc kubenswrapper[5116]: I0322 00:09:10.818682 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.818932 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:10 crc kubenswrapper[5116]: E0322 00:09:10.824653 5116 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:11 
crc kubenswrapper[5116]: I0322 00:09:11.600147 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.821724 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.878648 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.879226 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.880069 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.880122 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.880136 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:11 crc kubenswrapper[5116]: E0322 00:09:11.880675 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:11 crc kubenswrapper[5116]: I0322 00:09:11.894518 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.230836 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API 
group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.599792 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826263 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826865 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826905 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.826917 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.827293 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.871040 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.871605 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.872375 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.872403 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:12 crc 
kubenswrapper[5116]: I0322 00:09:12.872414 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.872771 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:12 crc kubenswrapper[5116]: I0322 00:09:12.873025 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.873246 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:12 crc kubenswrapper[5116]: E0322 00:09:12.877397 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC 
m=+21.841180635,LastTimestamp:2026-03-22 00:09:12.873214748 +0000 UTC m=+23.895516121,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:09:13 crc kubenswrapper[5116]: I0322 00:09:13.599652 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:14 crc kubenswrapper[5116]: E0322 00:09:14.570912 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 22 00:09:14 crc kubenswrapper[5116]: I0322 00:09:14.602691 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.121112 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.121424 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.122396 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.122437 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.122450 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:15 crc kubenswrapper[5116]: E0322 00:09:15.122870 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.128566 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.304297 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.305831 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.305955 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.305979 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.306020 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc" Mar 22 00:09:15 crc kubenswrapper[5116]: E0322 00:09:15.321944 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.599941 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 
00:09:15.832370 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.833022 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.833083 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:15 crc kubenswrapper[5116]: I0322 00:09:15.833095 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:15 crc kubenswrapper[5116]: E0322 00:09:15.833424 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:16 crc kubenswrapper[5116]: E0322 00:09:16.073778 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 22 00:09:16 crc kubenswrapper[5116]: E0322 00:09:16.553142 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 22 00:09:16 crc kubenswrapper[5116]: I0322 00:09:16.602564 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:16 crc kubenswrapper[5116]: E0322 00:09:16.740598 5116 reflector.go:200] "Failed to 
watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 22 00:09:17 crc kubenswrapper[5116]: I0322 00:09:17.602374 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:18 crc kubenswrapper[5116]: I0322 00:09:18.600572 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.235625 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.366679 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.366922 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.367725 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.367837 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.367924 
5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.368440 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.368781 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba" Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.369033 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.373452 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:19.369003969 +0000 UTC 
m=+30.391305342,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:19 crc kubenswrapper[5116]: I0322 00:09:19.604891 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:19 crc kubenswrapper[5116]: E0322 00:09:19.737981 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:09:20 crc kubenswrapper[5116]: I0322 00:09:20.598232 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:21 crc kubenswrapper[5116]: I0322 00:09:21.598722 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.322103 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323271 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323299 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.323335 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:09:22 crc kubenswrapper[5116]: E0322 00:09:22.336574 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 22 00:09:22 crc kubenswrapper[5116]: I0322 00:09:22.598736 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:23 crc kubenswrapper[5116]: I0322 00:09:23.596261 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:24 crc kubenswrapper[5116]: I0322 00:09:24.603737 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:25 crc kubenswrapper[5116]: E0322 00:09:25.376668 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:09:25 crc kubenswrapper[5116]: I0322 00:09:25.599537 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:26 crc kubenswrapper[5116]: E0322 00:09:26.241828 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 22 00:09:26 crc kubenswrapper[5116]: I0322 00:09:26.602971 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:27 crc kubenswrapper[5116]: I0322 00:09:27.600813 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:28 crc kubenswrapper[5116]: I0322 00:09:28.600364 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.337483 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339710 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339782 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339804 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.339843 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:09:29 crc kubenswrapper[5116]: E0322 00:09:29.352881 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 22 00:09:29 crc kubenswrapper[5116]: I0322 00:09:29.600711 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:29 crc kubenswrapper[5116]: E0322 00:09:29.738831 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:09:30 crc kubenswrapper[5116]: E0322 00:09:30.208105 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 22 00:09:30 crc kubenswrapper[5116]: I0322 00:09:30.600371 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:31 crc kubenswrapper[5116]: I0322 00:09:31.601277 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:32 crc kubenswrapper[5116]: E0322 00:09:32.365276 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 22 00:09:32 crc kubenswrapper[5116]: I0322 00:09:32.603385 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:32 crc kubenswrapper[5116]: E0322 00:09:32.993563 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.248415 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.600059 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.697692 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.699438 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.699503 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.699519 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.700065 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:33 crc kubenswrapper[5116]: I0322 00:09:33.700448 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba"
Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.708653 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b5e8dae9d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b5e8dae9d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.708306589 +0000 UTC m=+3.730607962,LastTimestamp:2026-03-22 00:09:33.702387974 +0000 UTC m=+44.724689377,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.921919 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6e019740\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6e019740 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container: kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.967561024 +0000 UTC m=+3.989862397,LastTimestamp:2026-03-22 00:09:33.917243181 +0000 UTC m=+44.939544554,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:33 crc kubenswrapper[5116]: E0322 00:09:33.937638 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013b6f3968d6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013b6f3968d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:08:52.987996374 +0000 UTC m=+4.010297747,LastTimestamp:2026-03-22 00:09:33.931114168 +0000 UTC m=+44.953415561,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.604253 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.880966 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log"
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.883539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"}
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.883830 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.895430 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.895501 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:34 crc kubenswrapper[5116]: I0322 00:09:34.895527 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:34 crc kubenswrapper[5116]: E0322 00:09:34.900736 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.602710 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.888267 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.888782 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/1.log"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890509 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" exitCode=255
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890578 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"}
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890636 5116 scope.go:117] "RemoveContainer" containerID="ae6177d08f823bd2a233290158c1f293be39581f2fbec2469c518a66465fbdba"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.890841 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.891405 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.891440 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.891453 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:35 crc kubenswrapper[5116]: E0322 00:09:35.891783 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:35 crc kubenswrapper[5116]: I0322 00:09:35.892051 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"
Mar 22 00:09:35 crc kubenswrapper[5116]: E0322 00:09:35.896196 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:09:35 crc kubenswrapper[5116]: E0322 00:09:35.901571 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:35.896138121 +0000 UTC m=+46.918439494,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.353234 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354255 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354271 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.354296 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:09:36 crc kubenswrapper[5116]: E0322 00:09:36.367962 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.599496 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:36 crc kubenswrapper[5116]: I0322 00:09:36.895129 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log"
Mar 22 00:09:37 crc kubenswrapper[5116]: I0322 00:09:37.599536 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:38 crc kubenswrapper[5116]: I0322 00:09:38.603417 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:39 crc kubenswrapper[5116]: I0322 00:09:39.600472 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:39 crc kubenswrapper[5116]: E0322 00:09:39.739316 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:09:40 crc kubenswrapper[5116]: E0322 00:09:40.254713 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 22 00:09:40 crc kubenswrapper[5116]: I0322 00:09:40.600325 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:41 crc kubenswrapper[5116]: I0322 00:09:41.603082 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.600759 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.871981 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.872249 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.873080 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.873234 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.873264 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:42 crc kubenswrapper[5116]: E0322 00:09:42.873848 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:42 crc kubenswrapper[5116]: I0322 00:09:42.874280 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"
Mar 22 00:09:42 crc kubenswrapper[5116]: E0322 00:09:42.874670 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:09:42 crc kubenswrapper[5116]: E0322 00:09:42.880089 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:42.874607975 +0000 UTC m=+53.896909378,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:43 crc kubenswrapper[5116]: E0322 00:09:43.065508 5116 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.368989 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370030 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370089 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370113 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.370154 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:09:43 crc kubenswrapper[5116]: E0322 00:09:43.377487 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 22 00:09:43 crc kubenswrapper[5116]: I0322 00:09:43.602859 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.601081 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.773759 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.774081 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.775161 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.775270 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.775290 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.775814 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.884223 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.884545 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.885670 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.885733 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.885753 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.886592 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:09:44 crc kubenswrapper[5116]: I0322 00:09:44.887104 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"
Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.887549 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:09:44 crc kubenswrapper[5116]: E0322 00:09:44.893285 5116 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189f013f9607171e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189f013f9607171e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:3a14caf222afb62aaabdc47808b6f944,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:BackOff,Message:Back-off restarting failed container kube-apiserver-check-endpoints in pod kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:09:10.818879262 +0000 UTC m=+21.841180635,LastTimestamp:2026-03-22 00:09:44.887481395 +0000 UTC m=+55.909782808,Count:6,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 22 00:09:45 crc kubenswrapper[5116]: I0322 00:09:45.602757 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:46 crc kubenswrapper[5116]: I0322 00:09:46.603160 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:47 crc kubenswrapper[5116]: E0322 00:09:47.261531 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 22 00:09:47 crc kubenswrapper[5116]: I0322 00:09:47.599569 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:48 crc kubenswrapper[5116]: I0322 00:09:48.602938 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:49 crc kubenswrapper[5116]: I0322 00:09:49.601294 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:49 crc kubenswrapper[5116]: E0322 00:09:49.740478 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.377860 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379141 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379253 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379280 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.379317 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:09:50 crc kubenswrapper[5116]: E0322 00:09:50.396946 5116 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 22 00:09:50 crc kubenswrapper[5116]: I0322 00:09:50.599929 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:51 crc kubenswrapper[5116]: I0322 00:09:51.600666 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:52 crc kubenswrapper[5116]: I0322 00:09:52.601383 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:53 crc kubenswrapper[5116]: I0322 00:09:53.604269 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:54 crc kubenswrapper[5116]: E0322 00:09:54.269633 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 22 00:09:54 crc kubenswrapper[5116]: I0322 00:09:54.598692 5116 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.007098 5116 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jq45n"
Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.019596 5116 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jq45n"
Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.103729 5116 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 22 00:09:55 crc kubenswrapper[5116]: I0322 00:09:55.447339 5116 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 22 00:09:56 crc kubenswrapper[5116]: I0322 00:09:56.021360 5116 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2026-04-21 00:04:55 +0000 UTC" deadline="2026-04-17 17:49:21.420122729 +0000 UTC"
Mar 22 00:09:56 crc kubenswrapper[5116]: I0322 00:09:56.021423 5116 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="641h39m25.398705341s"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.398152 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399336 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399470 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399556 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.399757 5116 kubelet_node_status.go:78] "Attempting to register node" node="crc"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.407112 5116 kubelet_node_status.go:127] "Node was previously registered" node="crc"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.407523 5116 kubelet_node_status.go:81] "Successfully registered node" node="crc"
Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.407617 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410517 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410558 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410571 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410589 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.410604 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.430804 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438489 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438553 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438573 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438594 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.438609 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.448957 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456560 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456608 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456626 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456641 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.456652 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.465834 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472125 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472151 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472160 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472191 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:09:57 crc kubenswrapper[5116]: I0322 00:09:57.472204 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:09:57Z","lastTransitionTime":"2026-03-22T00:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.480488 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.480802 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.480885 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.581392 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.681606 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.782204 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.883376 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:57 crc kubenswrapper[5116]: E0322 00:09:57.984704 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.086424 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.186657 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.287098 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.387843 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.488702 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.589771 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.690019 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.697404 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698095 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698132 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698145 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.698517 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.698726 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.790080 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.891033 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.955836 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.957554 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"} Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.957764 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.958415 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.958449 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:09:58 crc kubenswrapper[5116]: I0322 00:09:58.958461 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.958835 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc" Mar 22 00:09:58 crc kubenswrapper[5116]: E0322 00:09:58.992045 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.092135 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.193116 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:09:59 crc 
kubenswrapper[5116]: E0322 00:09:59.294320 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.395270 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.495494 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.595676 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.696249 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.741411 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.797242 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.897696 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:09:59 crc kubenswrapper[5116]: E0322 00:09:59.998785 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.099856 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.200503 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.301158 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.401678 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.502435 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.602849 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.703959 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.805003 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.905963 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.964023 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.964674 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/2.log"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.966760 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" exitCode=255
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.966801 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerDied","Data":"4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"}
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.966836 5116 scope.go:117] "RemoveContainer" containerID="23ac57165ac4f4feba592e88c6e37bd923a9b307d259c6f668b5e9ac066da454"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.967142 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.968037 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.968096 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.968119 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.969106 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:10:00 crc kubenswrapper[5116]: I0322 00:10:00.969706 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"
Mar 22 00:10:00 crc kubenswrapper[5116]: E0322 00:10:00.978317 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.006492 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.106846 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.207988 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.308373 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.409580 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.510380 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.611239 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.711815 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.812627 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: E0322 00:10:01.913784 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:01 crc kubenswrapper[5116]: I0322 00:10:01.969711 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.014880 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.115929 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.216951 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.317237 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.338147 5116 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.417726 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.518817 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.619247 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.719677 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.820161 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.871156 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.871506 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.872558 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.872645 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.872657 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.873306 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:10:02 crc kubenswrapper[5116]: I0322 00:10:02.873676 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.873960 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:10:02 crc kubenswrapper[5116]: E0322 00:10:02.920669 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.020788 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.121116 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.221540 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.322318 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.422422 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.523580 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.624398 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.725327 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.826253 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:03 crc kubenswrapper[5116]: E0322 00:10:03.927200 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.027331 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.127723 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.228403 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.328922 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.429540 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.529828 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.630664 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.731637 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.832576 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:04 crc kubenswrapper[5116]: E0322 00:10:04.933713 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.034076 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.134472 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.235638 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.336444 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.437359 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.537515 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.638024 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.739269 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.839913 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:05 crc kubenswrapper[5116]: E0322 00:10:05.940853 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.041942 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.142297 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.242474 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.342941 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.443960 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.544599 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.645249 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.745846 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.846723 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:06 crc kubenswrapper[5116]: E0322 00:10:06.947901 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.048778 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.148933 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.249818 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.350560 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.450688 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.551214 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.622984 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627613 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627648 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627660 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:07 crc
kubenswrapper[5116]: I0322 00:10:07.627675 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.627684 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.640542 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a391
50f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\
\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\
":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645399 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645435 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645444 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645458 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.645467 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.658189 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661496 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661563 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661575 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661591 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.661601 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.671063 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677152 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677222 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677234 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677250 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:07 crc kubenswrapper[5116]: I0322 00:10:07.677263 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:07Z","lastTransitionTime":"2026-03-22T00:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.687792 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8
108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\
\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.687906 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.687933 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.788902 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.889537 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:07 crc kubenswrapper[5116]: E0322 00:10:07.989640 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.089727 5116 kubelet_node_status.go:515] "Error getting the current
node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.190220 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.291303 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.392528 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.493591 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.594507 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.694921 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.795307 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.895707 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.958717 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.959033 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.960127 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.960217
5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.960236 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.960796 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:10:08 crc kubenswrapper[5116]: I0322 00:10:08.961105 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.961407 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:10:08 crc kubenswrapper[5116]: E0322 00:10:08.996816 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.097387 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.197974 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.298376 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.398755 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09
crc kubenswrapper[5116]: E0322 00:10:09.499865 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.600776 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.701911 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.742322 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.802663 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:09 crc kubenswrapper[5116]: E0322 00:10:09.902878 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.003558 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.104053 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.205231 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.306343 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.406743 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.507052 5116 kubelet_node_status.go:515] "Error getting the
current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.607590 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.708534 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.809183 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:10 crc kubenswrapper[5116]: E0322 00:10:10.909372 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.009892 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.110345 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.210599 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.311249 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.411943 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.512837 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.613553 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.714730 5116
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.815949 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:11 crc kubenswrapper[5116]: E0322 00:10:11.916828 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.017262 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.118253 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.218971 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.319216 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.420448 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.520598 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.621259 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.721779 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc kubenswrapper[5116]: E0322 00:10:12.822528 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:12 crc
kubenswrapper[5116]: E0322 00:10:12.923368 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.024398 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.125315 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.225424 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.326512 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.427343 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.528518 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.629695 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.730573 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.831727 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:13 crc kubenswrapper[5116]: E0322 00:10:13.932769 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.033570 5116 kubelet_node_status.go:515] "Error getting the current node from lister"
err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.134644 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.235010 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.335529 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.436542 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.537022 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.638075 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.739156 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.840271 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:14 crc kubenswrapper[5116]: I0322 00:10:14.886336 5116 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 22 00:10:14 crc kubenswrapper[5116]: E0322 00:10:14.940946 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.041988 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.142447 5116
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.243209 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.343850 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.443969 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.544544 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.645642 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.696772 5116 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.697981 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.698043 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:15 crc kubenswrapper[5116]: I0322 00:10:15.698057 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.698736 5116 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"crc\" not found" node="crc"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.746024 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\"
not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.846383 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:15 crc kubenswrapper[5116]: E0322 00:10:15.947423 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.047587 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.148688 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.249647 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.350652 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.451211 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.552299 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.653404 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.754543 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.855773 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:16 crc kubenswrapper[5116]: E0322 00:10:16.956120 5116 kubelet_node_status.go:515] "Error getting the
current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.056338 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.157003 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.258047 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.358236 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.459007 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.560084 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.660696 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.761149 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.862208 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:17 crc kubenswrapper[5116]: E0322 00:10:17.962578 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.009259 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 22 00:10:18 crc kubenswrapper[5116]: I0322
00:10:18.014794 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015319 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015485 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.015648 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.024611 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032405 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032507 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032527 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032583 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.032602 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063908 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063952 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063966 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063983 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:18 crc kubenswrapper[5116]: I0322 00:10:18.063994 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:18Z","lastTransitionTime":"2026-03-22T00:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.072994 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32400456Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32861256Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:be02784ed82978c399102be1c6c9f2ca441be4d984e0fd7100c155dd4417ebbf\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0d50962980a5aeecae2d99c98913fb0f46940164e41de0af2ba0e3dafe0d9017\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:541db5b20a3d2199602b3b5ac80f09ea31498034e9ae3841238b03a39150f0d7\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:b1c859067d6b7b785ab4977ed7137c5b3bb257234f7d7737a1d2836cef1576b5\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08
951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4e
fb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":584721741},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141
094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e8
3fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2599f32933f5fea6066ede54ad8f6150adb7bd9067892f251d5913121d5c630d\\\"],\\\"sizeBytes\\\":472771950},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:651bbe9d418f49c2c889d731df67cf5d88dff59dc03f5a1b5d4c8bb3ae001f1a\\\"],\\\"sizeBytes\\\":469976318},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4fe612a1572df462d6a4b664a10bc2e6cad239648acbf8c0303f8fca5d2596c0\\\"],\\\"sizeBytes\\\":468393024},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a5bb05344dd2296077f5066e908ede0eea23f5a12fb78ef86a9513c88d3faaca\\\"],\\\"sizeBytes\\\":464375011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\"],\\\"sizeBytes\\\":462844959}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4e17d39b-4bf4-4f5d-b01b-aaffc38eb890\\\",\\\"systemUUID\\\":\\\"6f1c9f09-bd93-4412-afb3-903004a8bcf7\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.073246 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.073299 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.174215 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.275160 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.375975 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.476555 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.576691 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.676994 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.778281 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.878697 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:18 crc kubenswrapper[5116]: E0322 00:10:18.979490 5116 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.079647 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.180115 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.281065 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.382197 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.483113 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.583923 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.684244 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.743358 5116 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.784620 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.885244 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:19 crc kubenswrapper[5116]: E0322 00:10:19.986341 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.087381 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.187743 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.287917 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.388346 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.489256 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.589904 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.690549 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.791076 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.891889 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:20 crc kubenswrapper[5116]: E0322 00:10:20.992985 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.094120 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.194959 5116 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.295942 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.396838 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.497661 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.597829 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.698183 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: E0322 00:10:21.798673 5116 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"crc\" not found" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.831045 5116 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901820 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901897 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901911 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901936 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.901953 5116 
setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:21Z","lastTransitionTime":"2026-03-22T00:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.918965 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.932900 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 22 00:10:21 crc kubenswrapper[5116]: I0322 00:10:21.945379 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-etcd/etcd-crc" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004399 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004470 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004485 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004511 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.004531 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.047853 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.106992 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107036 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107045 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107061 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.107072 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.146071 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209588 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209638 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209651 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209668 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.209680 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311914 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311963 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311976 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.311994 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.312009 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413743 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413817 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413842 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413873 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.413896 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516449 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516499 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516515 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516531 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.516543 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620412 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620469 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620484 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620505 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.620521 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.628629 5116 apiserver.go:52] "Watching apiserver" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.635349 5116 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.635917 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-multus/multus-additional-cni-plugins-bk75f","openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/multus-9sq6c","openshift-multus/network-metrics-daemon-wlq8c","openshift-network-operator/iptables-alerter-5jnd7","openshift-etcd/etcd-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-machine-config-operator/machine-config-daemon-66g6d","openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6","openshift-network-diagnostics/network-check-target-fhkjl","openshift-network-node-identity/network-node-identity-dgvkt","openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv","openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4","openshift-image-registry/node-ca-2rwjp","openshift-ovn-kubernetes/ovnkube-node-n9zvq","openshift-dns/node-resolver-nwnjb","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.637154 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.641859 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.641980 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.644529 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.644762 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.644546 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.645595 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.645827 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.646457 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.646699 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.646828 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.647004 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.647136 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.648843 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.648865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.649227 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.663092 5116 status_manager.go:919] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.676292 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.680579 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.680667 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.680709 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.683011 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.683628 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.683854 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.688555 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.699093 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701641 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701642 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701840 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.701719 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.702218 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.704411 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.716510 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723408 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723466 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723479 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723499 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.723512 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.728415 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.728677 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731538 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731628 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731735 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731759 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.731787 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.737294 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.737360 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.737679 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.738692 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.738994 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.739216 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.739347 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.739620 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.741852 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.741992 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.744671 5116 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748010 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748073 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748089 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.748735 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.749069 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.751470 5116 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.752068 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.752312 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.752350 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.755541 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.756723 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.756923 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.757054 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.759921 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.759966 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.760783 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.768744 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780456 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d36c245b-3d7f-48eb-848e-c54198ae38a4-hosts-file\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780516 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-os-release\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780540 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-etc-kubernetes\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780570 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-kubelet\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-rootfs\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780626 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5zcr\" (UniqueName: \"kubernetes.io/projected/d36c245b-3d7f-48eb-848e-c54198ae38a4-kube-api-access-x5zcr\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780663 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780740 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d36c245b-3d7f-48eb-848e-c54198ae38a4-tmp-dir\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780770 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-cnibin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780798 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-socket-dir-parent\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780847 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.780874 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781058 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781399 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781616 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-k8s-cni-cncf-io\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781651 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-netns\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781675 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-proxy-tls\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.781717 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782022 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782053 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-system-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782073 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782093 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782112 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-multus\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782128 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-multus-daemon-config\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782145 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-conf-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782180 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782211 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-bin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782230 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-hostroot\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782248 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-multus-certs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782263 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrqs\" (UniqueName: \"kubernetes.io/projected/5188f25b-37c3-46f1-b939-199c6e082848-kube-api-access-sqrqs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782285 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782305 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pkhq\" (UniqueName: \"kubernetes.io/projected/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-kube-api-access-5pkhq\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782329 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782348 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-cni-binary-copy\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782368 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.782378 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.782543 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.282504674 +0000 UTC m=+94.304806047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.783469 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.782392 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.783640 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.783673 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.783785 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.283755015 +0000 UTC m=+94.306056388 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.784198 5116 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.784946 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-env-overrides\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.785875 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-iptables-alerter-script\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.785926 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/fc4541ce-7789-4670-bc75-5c2868e52ce0-ovnkube-identity-cm\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.795223 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc4541ce-7789-4670-bc75-5c2868e52ce0-webhook-cert\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.795600 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.797706 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34177974-8d82-49d2-a763-391d0df3bbd8-metrics-tls\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.801639 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7xz2\" (UniqueName: \"kubernetes.io/projected/34177974-8d82-49d2-a763-391d0df3bbd8-kube-api-access-m7xz2\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.802216 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsgwk\" (UniqueName: \"kubernetes.io/projected/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-kube-api-access-dsgwk\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803286 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803312 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803324 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803340 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803364 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803380 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803409 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.303389678 +0000 UTC m=+94.325691051 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.803447 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.303424359 +0000 UTC m=+94.325725732 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.804501 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nt2j\" (UniqueName: \"kubernetes.io/projected/fc4541ce-7789-4670-bc75-5c2868e52ce0-kube-api-access-8nt2j\") pod \"network-node-identity-dgvkt\" (UID: \"fc4541ce-7789-4670-bc75-5c2868e52ce0\") " pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.807430 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.815914 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.820680 5116 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.824531 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826639 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826684 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826697 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826715 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.826728 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.838574 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4ec1f0e405
3fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-22T00:09:59Z\\\",\\\"message\\\":\\\"ar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"ClientsPreferCBOR\\\\\\\" enabled=false\\\\nW0322 00:09:59.459991 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0322 00:09:59.460237 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0322 00:09:59.461209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2974540120/tls.crt::/tmp/serving-cert-2974540120/tls.key\\\\\\\"\\\\nI0322 00:09:59.961022 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0322 00:09:59.962668 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0322 00:09:59.962684 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0322 00:09:59.962706 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0322 00:09:59.962712 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0322 00:09:59.967903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0322 00:09:59.967930 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967936 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967945 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0322 00:09:59.967951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0322 00:09:59.967956 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0322 00:09:59.967963 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0322 00:09:59.967953 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0322 00:09:59.969056 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-22T00:09:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.850440 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.861453 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-66g6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.879192 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec484e57-1508-45a3-99a3-51dfa8ef6195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n9zvq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885671 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885736 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885762 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885790 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.885813 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886002 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886052 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886082 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886104 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886129 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886150 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") pod 
\"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886186 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886213 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886235 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886274 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886301 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886323 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886500 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886749 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887349 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887375 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887530 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887687 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887762 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp" (OuterVolumeSpecName: "tmp") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887724 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config" (OuterVolumeSpecName: "config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887816 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887847 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887876 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887918 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887969 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.887988 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888009 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888027 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888044 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888062 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.888084 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") pod \"e093be35-bb62-4843-b2e8-094545761610\" (UID: \"e093be35-bb62-4843-b2e8-094545761610\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888106 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888125 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888142 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888160 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888195 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888191 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888216 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888224 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888232 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888247 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888361 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888400 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888436 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888464 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888495 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888525 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888535 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz" (OuterVolumeSpecName: "kube-api-access-7jjkz") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "kube-api-access-7jjkz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888551 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888582 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888582 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6" (OuterVolumeSpecName: 
"kube-api-access-tkdh6") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "kube-api-access-tkdh6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888598 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888612 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.888642 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889004 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889023 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889059 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889091 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") pod \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\" (UID: \"6ee8fbd3-1f81-4666-96da-5afc70819f1a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889113 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889138 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889161 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889196 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889206 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889261 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889328 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889349 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") pod 
\"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889369 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889393 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889415 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889443 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") pod \"c491984c-7d4b-44aa-8c1e-d7974424fa47\" (UID: \"c491984c-7d4b-44aa-8c1e-d7974424fa47\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889472 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889521 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889543 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889565 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889587 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889607 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") pod \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\" (UID: \"a52afe44-fb37-46ed-a1f8-bf39727a3cbe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc 
kubenswrapper[5116]: I0322 00:10:22.889646 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889654 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889507 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889754 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") pod \"0effdbcf-dd7d-404d-9d48-77536d665a5d\" (UID: \"0effdbcf-dd7d-404d-9d48-77536d665a5d\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889789 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889813 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889834 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889831 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images" (OuterVolumeSpecName: "images") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889858 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889887 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889908 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") pod \"149b3c48-e17c-4a66-a835-d86dabf6ff13\" (UID: \"149b3c48-e17c-4a66-a835-d86dabf6ff13\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889930 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889959 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.889983 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890009 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890034 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890060 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890069 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp" (OuterVolumeSpecName: "tmp") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890089 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890105 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config" (OuterVolumeSpecName: "config") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890122 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890127 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config" (OuterVolumeSpecName: "config") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890154 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890202 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890205 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890228 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890261 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890289 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890312 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890333 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890356 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890380 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890412 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890429 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf" (OuterVolumeSpecName: "kube-api-access-6dmhf") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "kube-api-access-6dmhf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890437 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890452 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf" (OuterVolumeSpecName: "kube-api-access-ptkcf") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "kube-api-access-ptkcf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890510 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890537 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890557 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.890574 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890601 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890628 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grwfz\" (UniqueName: \"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890654 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890673 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") pod \"af41de71-79cf-4590-bbe9-9e8b848862cb\" (UID: \"af41de71-79cf-4590-bbe9-9e8b848862cb\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890696 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8zz\" (UniqueName: 
\"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890720 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890741 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890761 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890782 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890799 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: 
\"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890816 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890840 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") pod \"9f71a554-e414-4bc3-96d2-674060397afe\" (UID: \"9f71a554-e414-4bc3-96d2-674060397afe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890861 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890880 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") pod \"b605f283-6f2e-42da-a838-54421690f7d0\" (UID: \"b605f283-6f2e-42da-a838-54421690f7d0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890916 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890933 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890952 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890969 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890992 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891010 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: 
\"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891034 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891052 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891073 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891091 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") pod \"7df94c10-441d-4386-93a6-6730fb7bcde0\" (UID: \"7df94c10-441d-4386-93a6-6730fb7bcde0\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891109 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") pod \"42a11a02-47e1-488f-b270-2679d3298b0e\" (UID: \"42a11a02-47e1-488f-b270-2679d3298b0e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891127 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891145 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891188 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8ts\" (UniqueName: \"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891210 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891227 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891246 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: 
\"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891262 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") pod \"92dfbade-90b6-4169-8c07-72cff7f2c82b\" (UID: \"92dfbade-90b6-4169-8c07-72cff7f2c82b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891282 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891303 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891325 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891346 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") pod \"f7e2c886-118e-43bb-bef1-c78134de392b\" (UID: \"f7e2c886-118e-43bb-bef1-c78134de392b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891371 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.886911 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv" (OuterVolumeSpecName: "kube-api-access-6rmnv") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "kube-api-access-6rmnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891394 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890631 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp" (OuterVolumeSpecName: "kube-api-access-8nspp") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "kube-api-access-8nspp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.890706 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891422 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") pod \"d7e8f42f-dc0e-424b-bb56-5ec849834888\" (UID: \"d7e8f42f-dc0e-424b-bb56-5ec849834888\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891445 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892210 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") pod \"7afa918d-be67-40a6-803c-d3b0ae99d815\" (UID: \"7afa918d-be67-40a6-803c-d3b0ae99d815\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892264 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892295 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892348 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") pod \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\" (UID: \"dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892372 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892400 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892426 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") pod \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\" (UID: \"7fcc6409-8a0f-44c3-89e7-5aecd7610f8a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892455 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892481 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: 
\"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892507 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892532 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892554 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892580 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892607 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892638 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892665 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") pod \"81e39f7b-62e4-4fc9-992a-6535ce127a02\" (UID: \"81e39f7b-62e4-4fc9-992a-6535ce127a02\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892687 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892713 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892737 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") pod \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\" (UID: \"31fa8943-81cc-4750-a0b7-0fa9ab5af883\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892760 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.892788 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892811 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892837 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892862 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892885 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") pod \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\" (UID: \"f65c0ac1-8bca-454d-a2e6-e35cb418beac\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892909 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") pod 
\"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892930 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892954 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892978 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892999 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893026 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") pod \"7599e0b6-bddf-4def-b7f2-0b32206e8651\" (UID: \"7599e0b6-bddf-4def-b7f2-0b32206e8651\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893054 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893080 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") pod \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\" (UID: \"0dd0fbac-8c0d-4228-8faa-abbeedabf7db\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893104 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") pod \"ce090a97-9ab6-4c40-a719-64ff2acd9778\" (UID: \"ce090a97-9ab6-4c40-a719-64ff2acd9778\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893062 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe8361c-9ce7-48cd-9142-ae635e1b27d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\"
:0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893126 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") pod \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\" (UID: \"b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893430 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") pod \"94a6e063-3d1a-4d44-875d-185291448c31\" (UID: \"94a6e063-3d1a-4d44-875d-185291448c31\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893460 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") pod \"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893488 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") pod \"6077b63e-53a2-4f96-9d56-1ce0324e4913\" (UID: \"6077b63e-53a2-4f96-9d56-1ce0324e4913\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893511 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") pod \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893532 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893554 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") pod \"c5f2bfad-70f6-4185-a3d9-81ce12720767\" (UID: \"c5f2bfad-70f6-4185-a3d9-81ce12720767\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893574 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") pod \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\" (UID: \"9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893592 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") pod \"a7a88189-c967-4640-879e-27665747f20c\" (UID: \"a7a88189-c967-4640-879e-27665747f20c\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893610 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") pod \"301e1965-1754-483d-b6cc-bfae7038bbca\" (UID: \"301e1965-1754-483d-b6cc-bfae7038bbca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893630 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") pod \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\" (UID: \"e1d2a42d-af1d-4054-9618-ab545e0ed8b7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: 
I0322 00:10:22.893649 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893670 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") pod \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\" (UID: \"5ebfebf6-3ecd-458e-943f-bb25b52e2718\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893690 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") pod \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\" (UID: \"fc8db2c7-859d-47b3-a900-2bd0c0b2973b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893712 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893731 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") pod \"cc85e424-18b2-4924-920b-bd291a8c4b01\" (UID: \"cc85e424-18b2-4924-920b-bd291a8c4b01\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893754 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") 
pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893784 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") pod \"b4750666-1362-4001-abd0-6f89964cc621\" (UID: \"b4750666-1362-4001-abd0-6f89964cc621\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893815 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") pod \"16bdd140-dce1-464c-ab47-dd5798d1d256\" (UID: \"16bdd140-dce1-464c-ab47-dd5798d1d256\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893843 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") pod \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\" (UID: \"a555ff2e-0be6-46d5-897d-863bb92ae2b3\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893938 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893970 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893997 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894027 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894056 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") pod \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\" (UID: \"584e1f4a-8205-47d7-8efb-3afc6017c4c9\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894083 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") pod \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\" (UID: \"6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894113 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894140 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") pod 
\"2325ffef-9d5b-447f-b00e-3efc429acefe\" (UID: \"2325ffef-9d5b-447f-b00e-3efc429acefe\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894192 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") pod \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\" (UID: \"f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894223 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894256 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") pod \"18f80adb-c1c3-49ba-8ee4-932c851d3897\" (UID: \"18f80adb-c1c3-49ba-8ee4-932c851d3897\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894290 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") pod \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\" (UID: \"20ce4d18-fe25-4696-ad7c-1bd2d6200a3e\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894326 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") pod \"593a3561-7760-45c5-8f91-5aaef7475d0f\" (UID: \"593a3561-7760-45c5-8f91-5aaef7475d0f\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894367 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") pod \"567683bd-0efc-4f21-b076-e28559628404\" (UID: \"567683bd-0efc-4f21-b076-e28559628404\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894394 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") pod \"736c54fe-349c-4bb9-870a-d1c1d1c03831\" (UID: \"736c54fe-349c-4bb9-870a-d1c1d1c03831\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894428 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") pod \"01080b46-74f1-4191-8755-5152a57b3b25\" (UID: \"01080b46-74f1-4191-8755-5152a57b3b25\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896566 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") pod \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\" (UID: \"af33e427-6803-48c2-a76a-dd9deb7cbf9a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") pod \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\" (UID: \"a208c9c2-333b-4b4a-be0d-bc32ec38a821\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896671 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") pod 
\"6edfcf45-925b-4eff-b940-95b6fc0b85d4\" (UID: \"6edfcf45-925b-4eff-b940-95b6fc0b85d4\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896720 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") pod \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\" (UID: \"71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896756 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896805 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") pod \"d565531a-ff86-4608-9d19-767de01ac31b\" (UID: \"d565531a-ff86-4608-9d19-767de01ac31b\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896857 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") pod \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\" (UID: \"d45be74c-0d98-4d18-90e4-f7ef1b6daaf7\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896890 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896929 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896965 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896999 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") pod \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\" (UID: \"869851b9-7ffb-4af0-b166-1d8aa40a5f80\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897031 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") pod \"d19cb085-0c5b-4810-b654-ce7923221d90\" (UID: \"d19cb085-0c5b-4810-b654-ce7923221d90\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897062 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") pod \"f559dfa3-3917-43a2-97f6-61ddfda10e93\" (UID: \"f559dfa3-3917-43a2-97f6-61ddfda10e93\") " Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897105 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") pod \"09cfa50b-4138-4585-a53e-64dd3ab73335\" (UID: \"09cfa50b-4138-4585-a53e-64dd3ab73335\") " Mar 22 
00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897246 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d36c245b-3d7f-48eb-848e-c54198ae38a4-hosts-file\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897287 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-os-release\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897316 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-etc-kubernetes\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897348 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-kubelet\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897376 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-rootfs\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897411 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qscfn\" (UniqueName: \"kubernetes.io/projected/1811891e-33d0-4500-a481-0e4aa2d3e95c-kube-api-access-qscfn\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897439 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897480 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5zcr\" (UniqueName: \"kubernetes.io/projected/d36c245b-3d7f-48eb-848e-c54198ae38a4-kube-api-access-x5zcr\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897511 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897538 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 
00:10:22.897568 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897746 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d36c245b-3d7f-48eb-848e-c54198ae38a4-tmp-dir\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897782 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-cnibin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897816 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-socket-dir-parent\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897855 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1811891e-33d0-4500-a481-0e4aa2d3e95c-serviceca\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897885 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897967 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-k8s-cni-cncf-io\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898006 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-netns\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898034 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-proxy-tls\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898079 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-system-cni-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898119 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-cnibin\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1811891e-33d0-4500-a481-0e4aa2d3e95c-host\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898206 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898238 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898263 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898292 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898350 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-system-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898396 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gh6\" (UniqueName: \"kubernetes.io/projected/68dcbc21-b4ce-4285-9a4b-101724f82f33-kube-api-access-b7gh6\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898427 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898490 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-multus\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898532 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-multus-daemon-config\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898565 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898592 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-binary-copy\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898630 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898660 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898690 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-conf-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898730 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898759 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-bin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-hostroot\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898811 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 
00:10:22.898841 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-multus-certs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898883 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrqs\" (UniqueName: \"kubernetes.io/projected/5188f25b-37c3-46f1-b939-199c6e082848-kube-api-access-sqrqs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898943 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pkhq\" (UniqueName: \"kubernetes.io/projected/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-kube-api-access-5pkhq\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898975 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899012 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-os-release\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc 
kubenswrapper[5116]: I0322 00:10:22.899050 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899101 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzzhb\" (UniqueName: \"kubernetes.io/projected/94c19a90-c2c9-4236-98be-a0516dbb840b-kube-api-access-lzzhb\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899146 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899230 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ovnkube-node-n9zvq\" (UID: 
\"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899254 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899282 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899318 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-cni-binary-copy\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899349 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899376 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ovnkube-node-n9zvq\" (UID: 
\"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899404 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899431 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899498 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899523 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899552 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899589 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899625 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899751 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dmhf\" (UniqueName: \"kubernetes.io/projected/736c54fe-349c-4bb9-870a-d1c1d1c03831-kube-api-access-6dmhf\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899776 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkdh6\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-kube-api-access-tkdh6\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 
crc kubenswrapper[5116]: I0322 00:10:22.899793 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f65c0ac1-8bca-454d-a2e6-e35cb418beac-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899810 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09cfa50b-4138-4585-a53e-64dd3ab73335-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899824 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899840 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899860 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jjkz\" (UniqueName: \"kubernetes.io/projected/301e1965-1754-483d-b6cc-bfae7038bbca-kube-api-access-7jjkz\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899876 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7afa918d-be67-40a6-803c-d3b0ae99d815-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899892 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptkcf\" (UniqueName: \"kubernetes.io/projected/7599e0b6-bddf-4def-b7f2-0b32206e8651-kube-api-access-ptkcf\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899909 5116 reconciler_common.go:299] "Volume 
detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a555ff2e-0be6-46d5-897d-863bb92ae2b3-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899928 5116 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899944 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899958 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899973 5116 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899989 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900005 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900020 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900037 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c5f2bfad-70f6-4185-a3d9-81ce12720767-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900056 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900070 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900086 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rmnv\" (UniqueName: \"kubernetes.io/projected/b605f283-6f2e-42da-a838-54421690f7d0-kube-api-access-6rmnv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900101 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7afa918d-be67-40a6-803c-d3b0ae99d815-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900115 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7afa918d-be67-40a6-803c-d3b0ae99d815-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900129 5116 reconciler_common.go:299] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc 
kubenswrapper[5116]: I0322 00:10:22.900145 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16bdd140-dce1-464c-ab47-dd5798d1d256-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900160 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900217 5116 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-images\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900314 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/34177974-8d82-49d2-a763-391d0df3bbd8-host-etc-kube\") pod \"network-operator-7bdcf4f5bd-7fjxv\" (UID: \"34177974-8d82-49d2-a763-391d0df3bbd8\") " pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902869 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-conf-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903359 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903412 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d36c245b-3d7f-48eb-848e-c54198ae38a4-tmp-dir\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903432 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-bin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903473 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-hostroot\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903485 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-cnibin\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903523 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-multus-certs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903545 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-multus-socket-dir-parent\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " 
pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903613 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-k8s-cni-cncf-io\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903734 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-system-cni-dir\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903805 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-cni-multus\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.905421 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-run-netns\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.905698 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-multus-daemon-config\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.906560 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5188f25b-37c3-46f1-b939-199c6e082848-cni-binary-copy\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.906780 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/428b39f5-eb1c-4f65-b7a4-eeb6e84860cc-host-slash\") pod \"iptables-alerter-5jnd7\" (UID: \"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\") " pod="openshift-network-operator/iptables-alerter-5jnd7" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891297 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891369 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891761 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2" (OuterVolumeSpecName: "kube-api-access-ks6v2") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "kube-api-access-ks6v2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891918 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913504 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-host-var-lib-kubelet\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913558 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts" (OuterVolumeSpecName: "kube-api-access-4g8ts") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "kube-api-access-4g8ts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.891967 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892126 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b" (OuterVolumeSpecName: "kube-api-access-zsb9b") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "kube-api-access-zsb9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892270 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv" (OuterVolumeSpecName: "kube-api-access-dztfv") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "kube-api-access-dztfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892478 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv" (OuterVolumeSpecName: "kube-api-access-pddnv") pod "e093be35-bb62-4843-b2e8-094545761610" (UID: "e093be35-bb62-4843-b2e8-094545761610"). InnerVolumeSpecName "kube-api-access-pddnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892773 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s" (OuterVolumeSpecName: "kube-api-access-xfp5s") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "kube-api-access-xfp5s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913809 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config" (OuterVolumeSpecName: "config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913854 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume" (OuterVolumeSpecName: "config-volume") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892850 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892868 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn" (OuterVolumeSpecName: "kube-api-access-xnxbn") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "kube-api-access-xnxbn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.892993 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893051 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx" (OuterVolumeSpecName: "kube-api-access-l9stx") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "kube-api-access-l9stx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893203 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config" (OuterVolumeSpecName: "console-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893677 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr" (OuterVolumeSpecName: "kube-api-access-6g4lr") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "kube-api-access-6g4lr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.893843 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894258 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894423 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7afa918d-be67-40a6-803c-d3b0ae99d815" (UID: "7afa918d-be67-40a6-803c-d3b0ae99d815"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894493 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894649 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities" (OuterVolumeSpecName: "utilities") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.894812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config" (OuterVolumeSpecName: "config") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895071 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895337 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp" (OuterVolumeSpecName: "tmp") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895560 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk" (OuterVolumeSpecName: "kube-api-access-w94wk") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "kube-api-access-w94wk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896407 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896440 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895976 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config" (OuterVolumeSpecName: "config") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.895667 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config" (OuterVolumeSpecName: "config") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896552 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities" (OuterVolumeSpecName: "utilities") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.896659 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897828 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities" (OuterVolumeSpecName: "utilities") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.897889 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898191 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca" (OuterVolumeSpecName: "client-ca") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898263 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898389 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem" (OuterVolumeSpecName: "ca-trust-extracted-pem") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "ca-trust-extracted-pem". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.898628 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca" (OuterVolumeSpecName: "client-ca") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899471 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr" (OuterVolumeSpecName: "kube-api-access-wj4qr") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "kube-api-access-wj4qr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899584 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m" (OuterVolumeSpecName: "kube-api-access-4hb7m") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "kube-api-access-4hb7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899604 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899722 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.899885 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900014 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900191 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj" (OuterVolumeSpecName: "kube-api-access-mfzkj") pod "0effdbcf-dd7d-404d-9d48-77536d665a5d" (UID: "0effdbcf-dd7d-404d-9d48-77536d665a5d"). InnerVolumeSpecName "kube-api-access-mfzkj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900220 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900230 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900326 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp" (OuterVolumeSpecName: "tmp") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900860 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config" (OuterVolumeSpecName: "config") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: E0322 00:10:22.900907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.400884681 +0000 UTC m=+94.423186054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914538 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l" (OuterVolumeSpecName: "kube-api-access-sbc2l") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "kube-api-access-sbc2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900901 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities" (OuterVolumeSpecName: "utilities") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900915 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901107 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert" (OuterVolumeSpecName: "cert") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901541 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901547 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp" (OuterVolumeSpecName: "tmp") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.900832 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901657 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901527 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901695 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901980 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9" (OuterVolumeSpecName: "kube-api-access-9vsz9") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "kube-api-access-9vsz9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901986 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd" (OuterVolumeSpecName: "kube-api-access-8pskd") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "kube-api-access-8pskd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901983 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901999 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit" (OuterVolumeSpecName: "audit") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902012 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config" (OuterVolumeSpecName: "config") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914670 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.901862 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "c491984c-7d4b-44aa-8c1e-d7974424fa47" (UID: "c491984c-7d4b-44aa-8c1e-d7974424fa47"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902389 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq" (OuterVolumeSpecName: "kube-api-access-m26jq") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "kube-api-access-m26jq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902435 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv" (OuterVolumeSpecName: "kube-api-access-xxfcv") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "kube-api-access-xxfcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902548 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902714 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq" (OuterVolumeSpecName: "kube-api-access-d4tqq") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "kube-api-access-d4tqq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902614 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.902853 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903032 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903066 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903089 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6" (OuterVolumeSpecName: "kube-api-access-ftwb6") pod "9f71a554-e414-4bc3-96d2-674060397afe" (UID: "9f71a554-e414-4bc3-96d2-674060397afe"). InnerVolumeSpecName "kube-api-access-ftwb6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903150 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914784 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca" (OuterVolumeSpecName: "service-ca") pod "d7e8f42f-dc0e-424b-bb56-5ec849834888" (UID: "d7e8f42f-dc0e-424b-bb56-5ec849834888"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.914808 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj" (OuterVolumeSpecName: "kube-api-access-qgrkj") pod "42a11a02-47e1-488f-b270-2679d3298b0e" (UID: "42a11a02-47e1-488f-b270-2679d3298b0e"). InnerVolumeSpecName "kube-api-access-qgrkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903379 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903395 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903393 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config" (OuterVolumeSpecName: "config") pod "c5f2bfad-70f6-4185-a3d9-81ce12720767" (UID: "c5f2bfad-70f6-4185-a3d9-81ce12720767"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.903559 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9" (OuterVolumeSpecName: "kube-api-access-ddlk9") pod "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" (UID: "7fcc6409-8a0f-44c3-89e7-5aecd7610f8a"). InnerVolumeSpecName "kube-api-access-ddlk9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904296 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904136 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities" (OuterVolumeSpecName: "utilities") pod "b605f283-6f2e-42da-a838-54421690f7d0" (UID: "b605f283-6f2e-42da-a838-54421690f7d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904359 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw" (OuterVolumeSpecName: "kube-api-access-5lcfw") pod "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" (UID: "dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9"). InnerVolumeSpecName "kube-api-access-5lcfw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904374 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.904400 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.908449 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.908755 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909095 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909146 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909462 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps" (OuterVolumeSpecName: "kube-api-access-d7cps") pod "af41de71-79cf-4590-bbe9-9e8b848862cb" (UID: "af41de71-79cf-4590-bbe9-9e8b848862cb"). InnerVolumeSpecName "kube-api-access-d7cps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909919 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b" (OuterVolumeSpecName: "kube-api-access-pgx6b") pod "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" (UID: "f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4"). InnerVolumeSpecName "kube-api-access-pgx6b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909881 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk" (OuterVolumeSpecName: "kube-api-access-qqbfk") pod "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" (UID: "b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a"). InnerVolumeSpecName "kube-api-access-qqbfk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.909976 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf" (OuterVolumeSpecName: "kube-api-access-nmmzf") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "kube-api-access-nmmzf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910023 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910296 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh" (OuterVolumeSpecName: "kube-api-access-m5lgh") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "kube-api-access-m5lgh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910371 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910391 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz" (OuterVolumeSpecName: "kube-api-access-grwfz") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "kube-api-access-grwfz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910667 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg" (OuterVolumeSpecName: "kube-api-access-hckvg") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "kube-api-access-hckvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910711 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910708 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config" (OuterVolumeSpecName: "config") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.910743 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config" (OuterVolumeSpecName: "config") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911069 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911065 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911083 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs" (OuterVolumeSpecName: "kube-api-access-l87hs") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "kube-api-access-l87hs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911258 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911434 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw" (OuterVolumeSpecName: "kube-api-access-9z4sw") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "kube-api-access-9z4sw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911454 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01080b46-74f1-4191-8755-5152a57b3b25" (UID: "01080b46-74f1-4191-8755-5152a57b3b25"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911755 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd" (OuterVolumeSpecName: "kube-api-access-mjwtd") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "kube-api-access-mjwtd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911824 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca" (OuterVolumeSpecName: "serviceca") pod "5ebfebf6-3ecd-458e-943f-bb25b52e2718" (UID: "5ebfebf6-3ecd-458e-943f-bb25b52e2718"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.911985 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h" (OuterVolumeSpecName: "kube-api-access-94l9h") pod "16bdd140-dce1-464c-ab47-dd5798d1d256" (UID: "16bdd140-dce1-464c-ab47-dd5798d1d256"). InnerVolumeSpecName "kube-api-access-94l9h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912425 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t" (OuterVolumeSpecName: "kube-api-access-zth6t") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "kube-api-access-zth6t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912397 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "92dfbade-90b6-4169-8c07-72cff7f2c82b" (UID: "92dfbade-90b6-4169-8c07-72cff7f2c82b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912510 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912464 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912595 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912580 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.912686 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl" (OuterVolumeSpecName: "kube-api-access-twvbl") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "kube-api-access-twvbl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913121 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities" (OuterVolumeSpecName: "utilities") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.913269 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config" (OuterVolumeSpecName: "config") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.915534 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9" (OuterVolumeSpecName: "kube-api-access-99zj9") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "kube-api-access-99zj9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.915954 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916035 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-proxy-tls\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916052 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-rootfs\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916447 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.916564 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917024 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities" (OuterVolumeSpecName: "utilities") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917341 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc" (OuterVolumeSpecName: "kube-api-access-zg8nc") pod "2325ffef-9d5b-447f-b00e-3efc429acefe" (UID: "2325ffef-9d5b-447f-b00e-3efc429acefe"). InnerVolumeSpecName "kube-api-access-zg8nc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917609 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" (UID: "9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.917771 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6" (OuterVolumeSpecName: "kube-api-access-pllx6") pod "81e39f7b-62e4-4fc9-992a-6535ce127a02" (UID: "81e39f7b-62e4-4fc9-992a-6535ce127a02"). InnerVolumeSpecName "kube-api-access-pllx6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918088 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e1d2a42d-af1d-4054-9618-ab545e0ed8b7" (UID: "e1d2a42d-af1d-4054-9618-ab545e0ed8b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918243 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918413 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7599e0b6-bddf-4def-b7f2-0b32206e8651" (UID: "7599e0b6-bddf-4def-b7f2-0b32206e8651"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918691 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc85e424-18b2-4924-920b-bd291a8c4b01" (UID: "cc85e424-18b2-4924-920b-bd291a8c4b01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.918746 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities" (OuterVolumeSpecName: "utilities") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919320 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919410 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images" (OuterVolumeSpecName: "images") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919547 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919788 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f65c0ac1-8bca-454d-a2e6-e35cb418beac" (UID: "f65c0ac1-8bca-454d-a2e6-e35cb418beac"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919887 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "6077b63e-53a2-4f96-9d56-1ce0324e4913" (UID: "6077b63e-53a2-4f96-9d56-1ce0324e4913"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919916 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-os-release\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.919959 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5188f25b-37c3-46f1-b939-199c6e082848-etc-kubernetes\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.920261 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d36c245b-3d7f-48eb-848e-c54198ae38a4-hosts-file\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb" Mar 22 00:10:22 crc 
kubenswrapper[5116]: I0322 00:10:22.920241 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b44f09af-f7e2-4bcb-bdba-55ff8b81f5de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95
ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:53Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":
{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.920594 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.920728 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-mcd-auth-proxy-config\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921071 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w" (OuterVolumeSpecName: "kube-api-access-rzt4w") pod "a52afe44-fb37-46ed-a1f8-bf39727a3cbe" (UID: "a52afe44-fb37-46ed-a1f8-bf39727a3cbe"). InnerVolumeSpecName "kube-api-access-rzt4w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921125 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7df94c10-441d-4386-93a6-6730fb7bcde0" (UID: "7df94c10-441d-4386-93a6-6730fb7bcde0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921325 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs" (OuterVolumeSpecName: "certs") pod "593a3561-7760-45c5-8f91-5aaef7475d0f" (UID: "593a3561-7760-45c5-8f91-5aaef7475d0f"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921361 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "567683bd-0efc-4f21-b076-e28559628404" (UID: "567683bd-0efc-4f21-b076-e28559628404"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921408 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921632 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key" (OuterVolumeSpecName: "signing-key") pod "ce090a97-9ab6-4c40-a719-64ff2acd9778" (UID: "ce090a97-9ab6-4c40-a719-64ff2acd9778"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921879 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.922323 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.922482 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "a7a88189-c967-4640-879e-27665747f20c" (UID: "a7a88189-c967-4640-879e-27665747f20c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.922640 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.921533 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" (UID: "20ce4d18-fe25-4696-ad7c-1bd2d6200a3e"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.923796 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7" (OuterVolumeSpecName: "kube-api-access-hm9x7") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "kube-api-access-hm9x7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.924400 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c" (OuterVolumeSpecName: "kube-api-access-8nb9c") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "kube-api-access-8nb9c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.930939 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "b4750666-1362-4001-abd0-6f89964cc621" (UID: "b4750666-1362-4001-abd0-6f89964cc621"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.931578 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932119 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932152 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "6ee8fbd3-1f81-4666-96da-5afc70819f1a" (UID: "6ee8fbd3-1f81-4666-96da-5afc70819f1a"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932184 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf" (OuterVolumeSpecName: "kube-api-access-q4smf") pod "0dd0fbac-8c0d-4228-8faa-abbeedabf7db" (UID: "0dd0fbac-8c0d-4228-8faa-abbeedabf7db"). InnerVolumeSpecName "kube-api-access-q4smf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932511 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932794 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d565531a-ff86-4608-9d19-767de01ac31b" (UID: "d565531a-ff86-4608-9d19-767de01ac31b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932807 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.932876 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.933104 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pkhq\" (UniqueName: \"kubernetes.io/projected/9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3-kube-api-access-5pkhq\") pod \"machine-config-daemon-66g6d\" (UID: \"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\") " pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.933264 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7" (OuterVolumeSpecName: "kube-api-access-tknt7") pod "584e1f4a-8205-47d7-8efb-3afc6017c4c9" (UID: "584e1f4a-8205-47d7-8efb-3afc6017c4c9"). InnerVolumeSpecName "kube-api-access-tknt7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.933337 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrqs\" (UniqueName: \"kubernetes.io/projected/5188f25b-37c3-46f1-b939-199c6e082848-kube-api-access-sqrqs\") pod \"multus-9sq6c\" (UID: \"5188f25b-37c3-46f1-b939-199c6e082848\") " pod="openshift-multus/multus-9sq6c"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934568 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "f7e2c886-118e-43bb-bef1-c78134de392b" (UID: "f7e2c886-118e-43bb-bef1-c78134de392b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934570 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "301e1965-1754-483d-b6cc-bfae7038bbca" (UID: "301e1965-1754-483d-b6cc-bfae7038bbca"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934803 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg" (OuterVolumeSpecName: "kube-api-access-wbmqg") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "kube-api-access-wbmqg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.934924 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr" (OuterVolumeSpecName: "kube-api-access-z5rsr") pod "af33e427-6803-48c2-a76a-dd9deb7cbf9a" (UID: "af33e427-6803-48c2-a76a-dd9deb7cbf9a"). InnerVolumeSpecName "kube-api-access-z5rsr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935179 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935354 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935475 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "736c54fe-349c-4bb9-870a-d1c1d1c03831" (UID: "736c54fe-349c-4bb9-870a-d1c1d1c03831"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935428 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935779 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935817 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935821 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "d19cb085-0c5b-4810-b654-ce7923221d90" (UID: "d19cb085-0c5b-4810-b654-ce7923221d90"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935833 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp" (OuterVolumeSpecName: "tmp") pod "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" (UID: "d45be74c-0d98-4d18-90e4-f7ef1b6daaf7"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935831 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.935897 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap" (OuterVolumeSpecName: "whereabouts-flatfile-configmap") pod "869851b9-7ffb-4af0-b166-1d8aa40a5f80" (UID: "869851b9-7ffb-4af0-b166-1d8aa40a5f80"). InnerVolumeSpecName "whereabouts-flatfile-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936148 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936225 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:22Z","lastTransitionTime":"2026-03-22T00:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936601 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz" (OuterVolumeSpecName: "kube-api-access-ws8zz") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "kube-api-access-ws8zz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.936603 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config" (OuterVolumeSpecName: "config") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937092 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config" (OuterVolumeSpecName: "config") pod "09cfa50b-4138-4585-a53e-64dd3ab73335" (UID: "09cfa50b-4138-4585-a53e-64dd3ab73335"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937289 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937384 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a555ff2e-0be6-46d5-897d-863bb92ae2b3" (UID: "a555ff2e-0be6-46d5-897d-863bb92ae2b3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937505 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f559dfa3-3917-43a2-97f6-61ddfda10e93" (UID: "f559dfa3-3917-43a2-97f6-61ddfda10e93"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937584 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937657 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl" (OuterVolumeSpecName: "kube-api-access-26xrl") pod "a208c9c2-333b-4b4a-be0d-bc32ec38a821" (UID: "a208c9c2-333b-4b4a-be0d-bc32ec38a821"). InnerVolumeSpecName "kube-api-access-26xrl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937659 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.937961 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "18f80adb-c1c3-49ba-8ee4-932c851d3897" (UID: "18f80adb-c1c3-49ba-8ee4-932c851d3897"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.938073 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5zcr\" (UniqueName: \"kubernetes.io/projected/d36c245b-3d7f-48eb-848e-c54198ae38a4-kube-api-access-x5zcr\") pod \"node-resolver-nwnjb\" (UID: \"d36c245b-3d7f-48eb-848e-c54198ae38a4\") " pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.938418 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" (UID: "6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.939775 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "fc8db2c7-859d-47b3-a900-2bd0c0b2973b" (UID: "fc8db2c7-859d-47b3-a900-2bd0c0b2973b"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.939909 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.939931 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.940658 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "6edfcf45-925b-4eff-b940-95b6fc0b85d4" (UID: "6edfcf45-925b-4eff-b940-95b6fc0b85d4"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.941235 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "149b3c48-e17c-4a66-a835-d86dabf6ff13" (UID: "149b3c48-e17c-4a66-a835-d86dabf6ff13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.951183 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31fa8943-81cc-4750-a0b7-0fa9ab5af883" (UID: "31fa8943-81cc-4750-a0b7-0fa9ab5af883"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.951751 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.961182 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.965372 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.967158 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" (UID: "71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.968925 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-dgvkt"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.977438 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17ab744-68a7-4a24-8ef2-556696d752fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-bd7p4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.980359 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5jnd7"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.982726 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.986191 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2rwjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1811891e-33d0-4500-a481-0e4aa2d3e95c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qscfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2rwjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.987247 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94a6e063-3d1a-4d44-875d-185291448c31" (UID: "94a6e063-3d1a-4d44-875d-185291448c31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:10:22 crc kubenswrapper[5116]: I0322 00:10:22.994723 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-nwnjb"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.000963 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-os-release\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.000997 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001016 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzzhb\" (UniqueName: \"kubernetes.io/projected/94c19a90-c2c9-4236-98be-a0516dbb840b-kube-api-access-lzzhb\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001037 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001056 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001074 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001093 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001094 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-os-release\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001112 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001239 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001282 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001320 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001352 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001383 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001393 5116 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001424 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001456 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001487 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001533 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qscfn\" (UniqueName: \"kubernetes.io/projected/1811891e-33d0-4500-a481-0e4aa2d3e95c-kube-api-access-qscfn\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001565 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001603 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001639 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001681 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1811891e-33d0-4500-a481-0e4aa2d3e95c-serviceca\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001721 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001787 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-system-cni-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001830 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-cnibin\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001864 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1811891e-33d0-4500-a481-0e4aa2d3e95c-host\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001871 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001896 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001946 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.001950 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002010 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1811891e-33d0-4500-a481-0e4aa2d3e95c-host\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002018 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002025 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ovnkube-node-n9zvq\" (UID: 
\"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002076 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002115 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002152 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002790 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002903 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gh6\" (UniqueName: \"kubernetes.io/projected/68dcbc21-b4ce-4285-9a4b-101724f82f33-kube-api-access-b7gh6\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " 
pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.002957 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003032 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003087 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-binary-copy\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003137 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003201 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ovnkube-node-n9zvq\" 
(UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003254 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003519 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.003617 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004191 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004292 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod 
\"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004363 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.004584 5116 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010580 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010592 5116 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.006281 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008449 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008478 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-system-cni-dir\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008517 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68dcbc21-b4ce-4285-9a4b-101724f82f33-cnibin\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008544 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.008569 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.009270 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.009318 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68dcbc21-b4ce-4285-9a4b-101724f82f33-cni-binary-copy\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.009337 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.009441 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010251 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.010421 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.007397 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.005011 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dcbc21-b4ce-4285-9a4b-101724f82f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"
name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bk75f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.011968 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:23.511948195 +0000 UTC m=+94.534249568 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012068 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012255 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012377 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks6v2\" (UniqueName: \"kubernetes.io/projected/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-kube-api-access-ks6v2\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012408 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012453 5116 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/301e1965-1754-483d-b6cc-bfae7038bbca-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012492 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nspp\" (UniqueName: \"kubernetes.io/projected/a7a88189-c967-4640-879e-27665747f20c-kube-api-access-8nspp\") on 
node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.012585 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pddnv\" (UniqueName: \"kubernetes.io/projected/e093be35-bb62-4843-b2e8-094545761610-kube-api-access-pddnv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015157 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9e9b5059-1b3e-4067-a63d-2952cbe863af-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015242 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6g4lr\" (UniqueName: \"kubernetes.io/projected/f7e2c886-118e-43bb-bef1-c78134de392b-kube-api-access-6g4lr\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015258 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/149b3c48-e17c-4a66-a835-d86dabf6ff13-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015273 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w94wk\" (UniqueName: \"kubernetes.io/projected/01080b46-74f1-4191-8755-5152a57b3b25-kube-api-access-w94wk\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.015798 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9sq6c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016468 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/736c54fe-349c-4bb9-870a-d1c1d1c03831-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016531 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rzt4w\" (UniqueName: \"kubernetes.io/projected/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-kube-api-access-rzt4w\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016547 5116 reconciler_common.go:299] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c491984c-7d4b-44aa-8c1e-d7974424fa47-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016561 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016584 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5f2bfad-70f6-4185-a3d9-81ce12720767-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016598 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vsz9\" (UniqueName: \"kubernetes.io/projected/c491984c-7d4b-44aa-8c1e-d7974424fa47-kube-api-access-9vsz9\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016612 5116 reconciler_common.go:299] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a11a02-47e1-488f-b270-2679d3298b0e-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 
crc kubenswrapper[5116]: I0322 00:10:23.016625 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016647 5116 reconciler_common.go:299] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ee8fbd3-1f81-4666-96da-5afc70819f1a-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016665 5116 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016678 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4tqq\" (UniqueName: \"kubernetes.io/projected/6ee8fbd3-1f81-4666-96da-5afc70819f1a-kube-api-access-d4tqq\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016692 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e8f42f-dc0e-424b-bb56-5ec849834888-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016709 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016721 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d565531a-ff86-4608-9d19-767de01ac31b-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016737 5116 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016756 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnxbn\" (UniqueName: \"kubernetes.io/projected/ce090a97-9ab6-4c40-a719-64ff2acd9778-kube-api-access-xnxbn\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016768 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016782 5116 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016796 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016812 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c491984c-7d4b-44aa-8c1e-d7974424fa47-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016832 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016845 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xfp5s\" (UniqueName: \"kubernetes.io/projected/cc85e424-18b2-4924-920b-bd291a8c4b01-kube-api-access-xfp5s\") on node \"crc\" DevicePath 
\"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016861 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016878 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016890 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016902 5116 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a52afe44-fb37-46ed-a1f8-bf39727a3cbe-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016914 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9f71a554-e414-4bc3-96d2-674060397afe-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016930 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfzkj\" (UniqueName: \"kubernetes.io/projected/0effdbcf-dd7d-404d-9d48-77536d665a5d-kube-api-access-mfzkj\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016943 5116 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-ca-trust-extracted-pem\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016955 5116 reconciler_common.go:299] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016971 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016983 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dztfv\" (UniqueName: \"kubernetes.io/projected/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-kube-api-access-dztfv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.016995 5116 reconciler_common.go:299] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017008 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wj4qr\" (UniqueName: \"kubernetes.io/projected/149b3c48-e17c-4a66-a835-d86dabf6ff13-kube-api-access-wj4qr\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017024 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017036 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5f2bfad-70f6-4185-a3d9-81ce12720767-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017049 5116 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a7a88189-c967-4640-879e-27665747f20c-tmpfs\") 
on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017061 5116 reconciler_common.go:299] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017081 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18f80adb-c1c3-49ba-8ee4-932c851d3897-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017094 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017106 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zsb9b\" (UniqueName: \"kubernetes.io/projected/09cfa50b-4138-4585-a53e-64dd3ab73335-kube-api-access-zsb9b\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017122 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017133 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017145 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65c0ac1-8bca-454d-a2e6-e35cb418beac-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017156 5116 
reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017195 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017208 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d7e8f42f-dc0e-424b-bb56-5ec849834888-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017224 5116 reconciler_common.go:299] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017236 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9f71a554-e414-4bc3-96d2-674060397afe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017252 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017268 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017282 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017296 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017313 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017325 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nb9c\" (UniqueName: \"kubernetes.io/projected/6edfcf45-925b-4eff-b940-95b6fc0b85d4-kube-api-access-8nb9c\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017338 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m26jq\" (UniqueName: \"kubernetes.io/projected/567683bd-0efc-4f21-b076-e28559628404-kube-api-access-m26jq\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017355 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017368 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pskd\" (UniqueName: \"kubernetes.io/projected/a555ff2e-0be6-46d5-897d-863bb92ae2b3-kube-api-access-8pskd\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017380 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-grwfz\" (UniqueName: 
\"kubernetes.io/projected/31fa8943-81cc-4750-a0b7-0fa9ab5af883-kube-api-access-grwfz\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017391 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddlk9\" (UniqueName: \"kubernetes.io/projected/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-kube-api-access-ddlk9\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017429 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017676 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d7cps\" (UniqueName: \"kubernetes.io/projected/af41de71-79cf-4590-bbe9-9e8b848862cb-kube-api-access-d7cps\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017704 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ws8zz\" (UniqueName: \"kubernetes.io/projected/9e9b5059-1b3e-4067-a63d-2952cbe863af-kube-api-access-ws8zz\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017726 5116 reconciler_common.go:299] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017755 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/736c54fe-349c-4bb9-870a-d1c1d1c03831-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017776 
5116 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7df94c10-441d-4386-93a6-6730fb7bcde0-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017800 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.017821 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/92dfbade-90b6-4169-8c07-72cff7f2c82b-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018137 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018208 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftwb6\" (UniqueName: \"kubernetes.io/projected/9f71a554-e414-4bc3-96d2-674060397afe-kube-api-access-ftwb6\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018230 5116 reconciler_common.go:299] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/16bdd140-dce1-464c-ab47-dd5798d1d256-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018249 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b605f283-6f2e-42da-a838-54421690f7d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018269 5116 
reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/81e39f7b-62e4-4fc9-992a-6535ce127a02-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018361 5116 reconciler_common.go:299] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a208c9c2-333b-4b4a-be0d-bc32ec38a821-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018387 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7df94c10-441d-4386-93a6-6730fb7bcde0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018411 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018437 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qqbfk\" (UniqueName: \"kubernetes.io/projected/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-kube-api-access-qqbfk\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018457 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018474 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01080b46-74f1-4191-8755-5152a57b3b25-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018493 5116 reconciler_common.go:299] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018516 5116 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5ebfebf6-3ecd-458e-943f-bb25b52e2718-serviceca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018534 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018555 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nmmzf\" (UniqueName: \"kubernetes.io/projected/7df94c10-441d-4386-93a6-6730fb7bcde0-kube-api-access-nmmzf\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018575 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgrkj\" (UniqueName: \"kubernetes.io/projected/42a11a02-47e1-488f-b270-2679d3298b0e-kube-api-access-qgrkj\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018599 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018617 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99zj9\" (UniqueName: \"kubernetes.io/projected/d565531a-ff86-4608-9d19-767de01ac31b-kube-api-access-99zj9\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018637 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4g8ts\" (UniqueName: 
\"kubernetes.io/projected/92dfbade-90b6-4169-8c07-72cff7f2c82b-kube-api-access-4g8ts\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018659 5116 reconciler_common.go:299] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018682 5116 reconciler_common.go:299] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018699 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twvbl\" (UniqueName: \"kubernetes.io/projected/b4750666-1362-4001-abd0-6f89964cc621-kube-api-access-twvbl\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018719 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92dfbade-90b6-4169-8c07-72cff7f2c82b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018743 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbc2l\" (UniqueName: \"kubernetes.io/projected/593a3561-7760-45c5-8f91-5aaef7475d0f-kube-api-access-sbc2l\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018762 5116 reconciler_common.go:299] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018780 5116 reconciler_common.go:299] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-webhook-certs\") on node \"crc\" 
DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018798 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f7e2c886-118e-43bb-bef1-c78134de392b-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018821 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018839 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94a6e063-3d1a-4d44-875d-185291448c31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018856 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d7e8f42f-dc0e-424b-bb56-5ec849834888-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018875 5116 reconciler_common.go:299] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018901 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7afa918d-be67-40a6-803c-d3b0ae99d815-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018919 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018949 5116 reconciler_common.go:299] "Volume detached 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18f80adb-c1c3-49ba-8ee4-932c851d3897-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018968 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5lcfw\" (UniqueName: \"kubernetes.io/projected/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9-kube-api-access-5lcfw\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.018990 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019012 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019032 5116 reconciler_common.go:299] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019056 5116 reconciler_common.go:299] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019075 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hckvg\" (UniqueName: \"kubernetes.io/projected/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-kube-api-access-hckvg\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019094 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2325ffef-9d5b-447f-b00e-3efc429acefe-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019113 5116 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9e9b5059-1b3e-4067-a63d-2952cbe863af-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019135 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5lgh\" (UniqueName: \"kubernetes.io/projected/d19cb085-0c5b-4810-b654-ce7923221d90-kube-api-access-m5lgh\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019155 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zth6t\" (UniqueName: \"kubernetes.io/projected/6077b63e-53a2-4f96-9d56-1ce0324e4913-kube-api-access-zth6t\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019197 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019215 5116 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019238 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pllx6\" (UniqueName: \"kubernetes.io/projected/81e39f7b-62e4-4fc9-992a-6535ce127a02-kube-api-access-pllx6\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019256 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-proxy-tls\") on node \"crc\" 
DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.019275 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/567683bd-0efc-4f21-b076-e28559628404-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.021079 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31fa8943-81cc-4750-a0b7-0fa9ab5af883-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.021561 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1811891e-33d0-4500-a481-0e4aa2d3e95c-serviceca\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.023188 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.025283 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.026926 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ovnkube-node-n9zvq\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 
00:10:23.029536 5116 reconciler_common.go:299] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-images\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029569 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zg8nc\" (UniqueName: \"kubernetes.io/projected/2325ffef-9d5b-447f-b00e-3efc429acefe-kube-api-access-zg8nc\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029587 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7599e0b6-bddf-4def-b7f2-0b32206e8651-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029600 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f65c0ac1-8bca-454d-a2e6-e35cb418beac-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029611 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f65c0ac1-8bca-454d-a2e6-e35cb418beac-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029652 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029663 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/584e1f4a-8205-47d7-8efb-3afc6017c4c9-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029675 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/567683bd-0efc-4f21-b076-e28559628404-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029686 5116 reconciler_common.go:299] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029697 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6077b63e-53a2-4f96-9d56-1ce0324e4913-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029708 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7599e0b6-bddf-4def-b7f2-0b32206e8651-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029720 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a555ff2e-0be6-46d5-897d-863bb92ae2b3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029833 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4smf\" (UniqueName: \"kubernetes.io/projected/0dd0fbac-8c0d-4228-8faa-abbeedabf7db-kube-api-access-q4smf\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029845 5116 reconciler_common.go:299] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ce090a97-9ab6-4c40-a719-64ff2acd9778-signing-key\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029855 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.029866 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hb7m\" (UniqueName: \"kubernetes.io/projected/94a6e063-3d1a-4d44-875d-185291448c31-kube-api-access-4hb7m\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029877 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029888 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6077b63e-53a2-4f96-9d56-1ce0324e4913-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029899 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029912 5116 reconciler_common.go:299] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4750666-1362-4001-abd0-6f89964cc621-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029926 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f2bfad-70f6-4185-a3d9-81ce12720767-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029936 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxfcv\" (UniqueName: \"kubernetes.io/projected/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff-kube-api-access-xxfcv\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029945 5116 reconciler_common.go:299] 
"Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7a88189-c967-4640-879e-27665747f20c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029954 5116 reconciler_common.go:299] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/301e1965-1754-483d-b6cc-bfae7038bbca-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029964 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9z4sw\" (UniqueName: \"kubernetes.io/projected/e1d2a42d-af1d-4054-9618-ab545e0ed8b7-kube-api-access-9z4sw\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029974 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029984 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l87hs\" (UniqueName: \"kubernetes.io/projected/5ebfebf6-3ecd-458e-943f-bb25b52e2718-kube-api-access-l87hs\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.029995 5116 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc8db2c7-859d-47b3-a900-2bd0c0b2973b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030005 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hm9x7\" (UniqueName: \"kubernetes.io/projected/f559dfa3-3917-43a2-97f6-61ddfda10e93-kube-api-access-hm9x7\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030015 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cc85e424-18b2-4924-920b-bd291a8c4b01-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030028 5116 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030041 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4750666-1362-4001-abd0-6f89964cc621-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030051 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-94l9h\" (UniqueName: \"kubernetes.io/projected/16bdd140-dce1-464c-ab47-dd5798d1d256-kube-api-access-94l9h\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030062 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a555ff2e-0be6-46d5-897d-863bb92ae2b3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030098 5116 reconciler_common.go:299] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-audit\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030127 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f559dfa3-3917-43a2-97f6-61ddfda10e93-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030158 5116 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af33e427-6803-48c2-a76a-dd9deb7cbf9a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.030207 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tknt7\" (UniqueName: \"kubernetes.io/projected/584e1f4a-8205-47d7-8efb-3afc6017c4c9-kube-api-access-tknt7\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030217 5116 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030228 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mjwtd\" (UniqueName: \"kubernetes.io/projected/869851b9-7ffb-4af0-b166-1d8aa40a5f80-kube-api-access-mjwtd\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030238 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2325ffef-9d5b-447f-b00e-3efc429acefe-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030356 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgx6b\" (UniqueName: \"kubernetes.io/projected/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4-kube-api-access-pgx6b\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030371 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af33e427-6803-48c2-a76a-dd9deb7cbf9a-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030384 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wbmqg\" (UniqueName: \"kubernetes.io/projected/18f80adb-c1c3-49ba-8ee4-932c851d3897-kube-api-access-wbmqg\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030398 5116 reconciler_common.go:299] 
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030409 5116 reconciler_common.go:299] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/593a3561-7760-45c5-8f91-5aaef7475d0f-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030420 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/567683bd-0efc-4f21-b076-e28559628404-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030431 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/736c54fe-349c-4bb9-870a-d1c1d1c03831-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030443 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01080b46-74f1-4191-8755-5152a57b3b25-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030454 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z5rsr\" (UniqueName: \"kubernetes.io/projected/af33e427-6803-48c2-a76a-dd9deb7cbf9a-kube-api-access-z5rsr\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030466 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26xrl\" (UniqueName: \"kubernetes.io/projected/a208c9c2-333b-4b4a-be0d-bc32ec38a821-kube-api-access-26xrl\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030481 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/6edfcf45-925b-4eff-b940-95b6fc0b85d4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030579 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"058dd31f-b3ad-4ab1-a174-760d8eb305f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:
52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030767 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzzhb\" (UniqueName: \"kubernetes.io/projected/94c19a90-c2c9-4236-98be-a0516dbb840b-kube-api-access-lzzhb\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030834 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.030969 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"269e063a36b7b7fc11164fcc08c4e2be795a7587abb86cc3cd3059814b1428ed"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031089 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031105 5116 reconciler_common.go:299] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d565531a-ff86-4608-9d19-767de01ac31b-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031116 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031126 
5116 reconciler_common.go:299] "Volume detached for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-whereabouts-flatfile-configmap\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031137 5116 reconciler_common.go:299] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031150 5116 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9e9b5059-1b3e-4067-a63d-2952cbe863af-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031181 5116 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/869851b9-7ffb-4af0-b166-1d8aa40a5f80-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031194 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d19cb085-0c5b-4810-b654-ce7923221d90-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031203 5116 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f559dfa3-3917-43a2-97f6-61ddfda10e93-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031214 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09cfa50b-4138-4585-a53e-64dd3ab73335-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031223 5116 reconciler_common.go:299] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/92dfbade-90b6-4169-8c07-72cff7f2c82b-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.031234 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9stx\" (UniqueName: \"kubernetes.io/projected/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca-kube-api-access-l9stx\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.032251 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"e2e86fbd43894a1b6caf7d25e4e7fdfc49ae025f450dc0712baf376e54c613b2"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.033746 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"ovnkube-control-plane-57b78d8988-bd7p4\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.037649 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"f01464e4e01b07766168893ba8966ef163e5529f1b9bf0bd071a9fcee59ea506"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.037932 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qscfn\" (UniqueName: \"kubernetes.io/projected/1811891e-33d0-4500-a481-0e4aa2d3e95c-kube-api-access-qscfn\") pod \"node-ca-2rwjp\" (UID: \"1811891e-33d0-4500-a481-0e4aa2d3e95c\") " pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041426 5116 kubelet_node_status.go:736] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041576 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041629 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.041992 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.042306 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.042354 5116 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-b7gh6\" (UniqueName: \"kubernetes.io/projected/68dcbc21-b4ce-4285-9a4b-101724f82f33-kube-api-access-b7gh6\") pod \"multus-additional-cni-plugins-bk75f\" (UID: \"68dcbc21-b4ce-4285-9a4b-101724f82f33\") " pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.042636 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.055528 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0911025-04f5-4040-a72c-14769d03d8e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name
\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\
":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.055853 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.068541 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.073024 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.078913 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2rwjp" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.087568 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.096613 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.099642 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.108789 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94c19a90-c2c9-4236-98be-a0516dbb840b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlq8c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:23 crc kubenswrapper[5116]: W0322 00:10:23.120030 5116 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1811891e_33d0_4500_a481_0e4aa2d3e95c.slice/crio-16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5 WatchSource:0}: Error finding container 16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5: Status 404 returned error can't find the container with id 16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5 Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.123825 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bk75f" Mar 22 00:10:23 crc kubenswrapper[5116]: W0322 00:10:23.141405 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec484e57_1508_45a3_99a3_51dfa8ef6195.slice/crio-43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9 WatchSource:0}: Error finding container 43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9: Status 404 returned error can't find the container with id 43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9 Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147227 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147280 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147292 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147311 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.147322 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252138 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252251 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252269 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252286 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.252301 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334377 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334408 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.334451 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334497 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334553 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.334538441 +0000 UTC m=+95.356839814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334871 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334893 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334924 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.334916563 +0000 UTC m=+95.357217926 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334894 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334940 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.334960 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.334954495 +0000 UTC m=+95.357255868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335525 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335560 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335577 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.335648 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.335628266 +0000 UTC m=+95.357929709 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354898 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354933 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354941 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354956 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.354967 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.434910 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.435081 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.435054711 +0000 UTC m=+95.457356084 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.457982 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.458034 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.458046 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.458063 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: 
I0322 00:10:23.458076 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.535919 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.536079 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: E0322 00:10:23.536142 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:24.536126518 +0000 UTC m=+95.558427891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560776 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560810 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560819 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560834 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.560857 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663613 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663921 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663939 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663959 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.663973 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.704238 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01080b46-74f1-4191-8755-5152a57b3b25" path="/var/lib/kubelet/pods/01080b46-74f1-4191-8755-5152a57b3b25/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.705017 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09cfa50b-4138-4585-a53e-64dd3ab73335" path="/var/lib/kubelet/pods/09cfa50b-4138-4585-a53e-64dd3ab73335/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.725821 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd0fbac-8c0d-4228-8faa-abbeedabf7db" path="/var/lib/kubelet/pods/0dd0fbac-8c0d-4228-8faa-abbeedabf7db/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.733639 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0effdbcf-dd7d-404d-9d48-77536d665a5d" path="/var/lib/kubelet/pods/0effdbcf-dd7d-404d-9d48-77536d665a5d/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.752683 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149b3c48-e17c-4a66-a835-d86dabf6ff13" path="/var/lib/kubelet/pods/149b3c48-e17c-4a66-a835-d86dabf6ff13/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.764469 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16bdd140-dce1-464c-ab47-dd5798d1d256" path="/var/lib/kubelet/pods/16bdd140-dce1-464c-ab47-dd5798d1d256/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.765959 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18f80adb-c1c3-49ba-8ee4-932c851d3897" path="/var/lib/kubelet/pods/18f80adb-c1c3-49ba-8ee4-932c851d3897/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767499 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc 
kubenswrapper[5116]: I0322 00:10:23.767545 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767558 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767574 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.767601 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.773596 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ce4d18-fe25-4696-ad7c-1bd2d6200a3e" path="/var/lib/kubelet/pods/20ce4d18-fe25-4696-ad7c-1bd2d6200a3e/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.774591 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2325ffef-9d5b-447f-b00e-3efc429acefe" path="/var/lib/kubelet/pods/2325ffef-9d5b-447f-b00e-3efc429acefe/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.778377 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301e1965-1754-483d-b6cc-bfae7038bbca" path="/var/lib/kubelet/pods/301e1965-1754-483d-b6cc-bfae7038bbca/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.781094 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31fa8943-81cc-4750-a0b7-0fa9ab5af883" path="/var/lib/kubelet/pods/31fa8943-81cc-4750-a0b7-0fa9ab5af883/volumes" Mar 22 
00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.787735 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42a11a02-47e1-488f-b270-2679d3298b0e" path="/var/lib/kubelet/pods/42a11a02-47e1-488f-b270-2679d3298b0e/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.788841 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567683bd-0efc-4f21-b076-e28559628404" path="/var/lib/kubelet/pods/567683bd-0efc-4f21-b076-e28559628404/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.809466 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="584e1f4a-8205-47d7-8efb-3afc6017c4c9" path="/var/lib/kubelet/pods/584e1f4a-8205-47d7-8efb-3afc6017c4c9/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.810053 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="593a3561-7760-45c5-8f91-5aaef7475d0f" path="/var/lib/kubelet/pods/593a3561-7760-45c5-8f91-5aaef7475d0f/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.813696 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ebfebf6-3ecd-458e-943f-bb25b52e2718" path="/var/lib/kubelet/pods/5ebfebf6-3ecd-458e-943f-bb25b52e2718/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.815207 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6077b63e-53a2-4f96-9d56-1ce0324e4913" path="/var/lib/kubelet/pods/6077b63e-53a2-4f96-9d56-1ce0324e4913/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.817467 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca" path="/var/lib/kubelet/pods/6a81eec9-f29e-49a0-a15a-f2f5bd2d95ca/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.825917 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edfcf45-925b-4eff-b940-95b6fc0b85d4" path="/var/lib/kubelet/pods/6edfcf45-925b-4eff-b940-95b6fc0b85d4/volumes" Mar 22 
00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.837672 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee8fbd3-1f81-4666-96da-5afc70819f1a" path="/var/lib/kubelet/pods/6ee8fbd3-1f81-4666-96da-5afc70819f1a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.841502 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a" path="/var/lib/kubelet/pods/71c8ffbe-59c6-4e7d-aa1a-bbd315b3414a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.851825 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="736c54fe-349c-4bb9-870a-d1c1d1c03831" path="/var/lib/kubelet/pods/736c54fe-349c-4bb9-870a-d1c1d1c03831/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.854745 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7599e0b6-bddf-4def-b7f2-0b32206e8651" path="/var/lib/kubelet/pods/7599e0b6-bddf-4def-b7f2-0b32206e8651/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.862699 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afa918d-be67-40a6-803c-d3b0ae99d815" path="/var/lib/kubelet/pods/7afa918d-be67-40a6-803c-d3b0ae99d815/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.864268 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df94c10-441d-4386-93a6-6730fb7bcde0" path="/var/lib/kubelet/pods/7df94c10-441d-4386-93a6-6730fb7bcde0/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.867680 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcc6409-8a0f-44c3-89e7-5aecd7610f8a" path="/var/lib/kubelet/pods/7fcc6409-8a0f-44c3-89e7-5aecd7610f8a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871781 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871858 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871875 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871905 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.871926 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.879527 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e39f7b-62e4-4fc9-992a-6535ce127a02" path="/var/lib/kubelet/pods/81e39f7b-62e4-4fc9-992a-6535ce127a02/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.880684 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869851b9-7ffb-4af0-b166-1d8aa40a5f80" path="/var/lib/kubelet/pods/869851b9-7ffb-4af0-b166-1d8aa40a5f80/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.888778 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff" path="/var/lib/kubelet/pods/9276f8f5-2f24-48e1-ab6d-1aab0d8ec3ff/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.889493 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92dfbade-90b6-4169-8c07-72cff7f2c82b" path="/var/lib/kubelet/pods/92dfbade-90b6-4169-8c07-72cff7f2c82b/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 
00:10:23.895196 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a6e063-3d1a-4d44-875d-185291448c31" path="/var/lib/kubelet/pods/94a6e063-3d1a-4d44-875d-185291448c31/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.897613 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f71a554-e414-4bc3-96d2-674060397afe" path="/var/lib/kubelet/pods/9f71a554-e414-4bc3-96d2-674060397afe/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.909538 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a208c9c2-333b-4b4a-be0d-bc32ec38a821" path="/var/lib/kubelet/pods/a208c9c2-333b-4b4a-be0d-bc32ec38a821/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.912495 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52afe44-fb37-46ed-a1f8-bf39727a3cbe" path="/var/lib/kubelet/pods/a52afe44-fb37-46ed-a1f8-bf39727a3cbe/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.914473 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a555ff2e-0be6-46d5-897d-863bb92ae2b3" path="/var/lib/kubelet/pods/a555ff2e-0be6-46d5-897d-863bb92ae2b3/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.915259 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a88189-c967-4640-879e-27665747f20c" path="/var/lib/kubelet/pods/a7a88189-c967-4640-879e-27665747f20c/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.919263 5116 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.919527 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af33e427-6803-48c2-a76a-dd9deb7cbf9a" 
path="/var/lib/kubelet/pods/af33e427-6803-48c2-a76a-dd9deb7cbf9a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.927549 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af41de71-79cf-4590-bbe9-9e8b848862cb" path="/var/lib/kubelet/pods/af41de71-79cf-4590-bbe9-9e8b848862cb/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.931896 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a" path="/var/lib/kubelet/pods/b05a4c1d-fa93-4d3d-b6e5-235473e1ae2a/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.938109 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4750666-1362-4001-abd0-6f89964cc621" path="/var/lib/kubelet/pods/b4750666-1362-4001-abd0-6f89964cc621/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.943843 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b605f283-6f2e-42da-a838-54421690f7d0" path="/var/lib/kubelet/pods/b605f283-6f2e-42da-a838-54421690f7d0/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.944432 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c491984c-7d4b-44aa-8c1e-d7974424fa47" path="/var/lib/kubelet/pods/c491984c-7d4b-44aa-8c1e-d7974424fa47/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.946056 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f2bfad-70f6-4185-a3d9-81ce12720767" path="/var/lib/kubelet/pods/c5f2bfad-70f6-4185-a3d9-81ce12720767/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.946973 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc85e424-18b2-4924-920b-bd291a8c4b01" path="/var/lib/kubelet/pods/cc85e424-18b2-4924-920b-bd291a8c4b01/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.947504 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce090a97-9ab6-4c40-a719-64ff2acd9778" 
path="/var/lib/kubelet/pods/ce090a97-9ab6-4c40-a719-64ff2acd9778/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.949145 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19cb085-0c5b-4810-b654-ce7923221d90" path="/var/lib/kubelet/pods/d19cb085-0c5b-4810-b654-ce7923221d90/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.951161 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45be74c-0d98-4d18-90e4-f7ef1b6daaf7" path="/var/lib/kubelet/pods/d45be74c-0d98-4d18-90e4-f7ef1b6daaf7/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.953008 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d565531a-ff86-4608-9d19-767de01ac31b" path="/var/lib/kubelet/pods/d565531a-ff86-4608-9d19-767de01ac31b/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.954415 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e8f42f-dc0e-424b-bb56-5ec849834888" path="/var/lib/kubelet/pods/d7e8f42f-dc0e-424b-bb56-5ec849834888/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.957353 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9" path="/var/lib/kubelet/pods/dcd10325-9ba5-4a3b-8e4a-e57e3bf210f9/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.961420 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e093be35-bb62-4843-b2e8-094545761610" path="/var/lib/kubelet/pods/e093be35-bb62-4843-b2e8-094545761610/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.962231 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d2a42d-af1d-4054-9618-ab545e0ed8b7" path="/var/lib/kubelet/pods/e1d2a42d-af1d-4054-9618-ab545e0ed8b7/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.963739 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f559dfa3-3917-43a2-97f6-61ddfda10e93" 
path="/var/lib/kubelet/pods/f559dfa3-3917-43a2-97f6-61ddfda10e93/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.971139 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65c0ac1-8bca-454d-a2e6-e35cb418beac" path="/var/lib/kubelet/pods/f65c0ac1-8bca-454d-a2e6-e35cb418beac/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.972503 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4" path="/var/lib/kubelet/pods/f7648cbb-48eb-4ba8-87ec-eb096b8fa1e4/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.974427 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e2c886-118e-43bb-bef1-c78134de392b" path="/var/lib/kubelet/pods/f7e2c886-118e-43bb-bef1-c78134de392b/volumes" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975687 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975739 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975751 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975768 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.975779 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:23Z","lastTransitionTime":"2026-03-22T00:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:23 crc kubenswrapper[5116]: I0322 00:10:23.976792 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8db2c7-859d-47b3-a900-2bd0c0b2973b" path="/var/lib/kubelet/pods/fc8db2c7-859d-47b3-a900-2bd0c0b2973b/volumes" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.043372 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerStarted","Data":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.043430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerStarted","Data":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.043448 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerStarted","Data":"7eadfb4600290cb56b95da12f03d4c885e0344117c4889ec529ea4aaac7dd7ce"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.047500 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"4f407ddd10876e1a8c14d096fd93218ee2e273d2e13b40b7f47ac97e4f7577c8"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.047552 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" event={"ID":"fc4541ce-7789-4670-bc75-5c2868e52ce0","Type":"ContainerStarted","Data":"57fbcaaf231fb8acf6f6165c6611af0cf0d6cf135ea43f7a7481a2124db8ed43"} Mar 22 00:10:24 crc 
kubenswrapper[5116]: I0322 00:10:24.049540 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.049599 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"a55e076fcabd345547e41146fd9e4a751e99d82dd2f925d7e04407dbcd5c1367"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.053159 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" exitCode=0 Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.053191 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.053334 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.055787 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"1d1e21356eefd317bcf9ab8691392e59de36a5c78a5c0e2abe762755ab45df92"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.055821 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.055834 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"b7f3005da76032ee4dab81b7f41c5734f96c4b93a470a0cb1ed78aa7bf231102"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.058266 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerStarted","Data":"15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.058310 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerStarted","Data":"29c4f6cffa6dd6b4a39e497ff4f5ad41df8eb7d4e7d5b313b0b79363eb66c70c"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.058365 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-66g6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.060686 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwnjb" event={"ID":"d36c245b-3d7f-48eb-848e-c54198ae38a4","Type":"ContainerStarted","Data":"b9bc4199ef1749e4a84f489f40e0390b5b26d91adc18920dda60b81ab52cadeb"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.060749 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nwnjb" event={"ID":"d36c245b-3d7f-48eb-848e-c54198ae38a4","Type":"ContainerStarted","Data":"535e6dea8944ee8c6d28f33eb0fb646fa6fc219a9d2f1e217232b2b43b2fcbfb"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.062747 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rwjp" event={"ID":"1811891e-33d0-4500-a481-0e4aa2d3e95c","Type":"ContainerStarted","Data":"f51b492b0338abc8e31fd29413b4b5d43f33d541be8a08e10f8ebe50a03fb477"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.062786 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rwjp" event={"ID":"1811891e-33d0-4500-a481-0e4aa2d3e95c","Type":"ContainerStarted","Data":"16c3a69ba39325497a09a22f1ec9fc81921bd2fcfc5191e8acd8461faf4f32d5"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.064405 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" event={"ID":"34177974-8d82-49d2-a763-391d0df3bbd8","Type":"ContainerStarted","Data":"b8e5f28015a7839919e338305c619f5b1210d5514ab180e75985ccbd323c7240"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.075092 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec484e57-1508-45a3-99a3-51dfa8ef6195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n9zvq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079791 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079840 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079853 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079873 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.079889 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.087039 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe8361c-9ce7-48cd-9142-ae635e1b27d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\
\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.115839 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b44f09af-f7e2-4bcb-bdba-55ff8b81f5de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}}
,\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6
c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\
"containerID\\\":\\\"cri-o://88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:53Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.129199 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.140348 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.150048 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.159331 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17ab744-68a7-4a24-8ef2-556696d752fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri-o://6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-bd7p4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 
00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.169549 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2rwjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1811891e-33d0-4500-a481-0e4aa2d3e95c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qscfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2rwjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183114 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183186 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183202 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183222 5116 kubelet_node_status.go:736] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.183235 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.184834 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dcbc21-b4ce-4285-9a4b-101724f82f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bk75f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.195969 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058dd31f-b3ad-4ab1-a174-760d8eb305f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.207647 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.222871 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0911025-04f5-4040-a72c-14769d03d8e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name
\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\
":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.236044 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.248992 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.261738 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fbdfe828b092b23e6d4480daf3e0216aada6debaf1ef1b314a0a31e73ebf13c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-5ff7774fd9-nljh6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.270880 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94c19a90-c2c9-4236-98be-a0516dbb840b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlq8c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.283026 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4ec1f0e405
3fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-22T00:09:59Z\\\",\\\"message\\\":\\\"ar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"ClientsPreferCBOR\\\\\\\" enabled=false\\\\nW0322 00:09:59.459991 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0322 00:09:59.460237 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0322 00:09:59.461209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2974540120/tls.crt::/tmp/serving-cert-2974540120/tls.key\\\\\\\"\\\\nI0322 00:09:59.961022 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0322 00:09:59.962668 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0322 00:09:59.962684 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0322 00:09:59.962706 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0322 00:09:59.962712 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0322 00:09:59.967903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0322 00:09:59.967930 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967936 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967945 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0322 00:09:59.967951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0322 00:09:59.967956 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0322 00:09:59.967963 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0322 00:09:59.967953 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0322 00:09:59.969056 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-22T00:09:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285611 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285649 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285664 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285681 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.285694 5116 setters.go:618] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.298267 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.307480 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94c19a90-c2c9-4236-98be-a0516dbb840b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:49b34ce0d25eec7a6077f4bf21bf7d4e64e598d28785a20b9ee3594423b7de14\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzzhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wlq8c\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.321807 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"},\\\"containerID\\\":\\\"cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"265m\\\",\\\"memory\\\":\\\"1Gi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-bundle-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4ec1f0e405
3fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-22T00:09:59Z\\\",\\\"message\\\":\\\"ar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"ClientsPreferCBOR\\\\\\\" enabled=false\\\\nW0322 00:09:59.459991 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0322 00:09:59.460237 1 builder.go:304] check-endpoints version v0.0.0-unknown-c3d9642-c3d9642\\\\nI0322 00:09:59.461209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2974540120/tls.crt::/tmp/serving-cert-2974540120/tls.key\\\\\\\"\\\\nI0322 00:09:59.961022 1 requestheader_controller.go:255] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0322 00:09:59.962668 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0322 00:09:59.962684 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0322 00:09:59.962706 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0322 00:09:59.962712 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0322 00:09:59.967903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0322 00:09:59.967930 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967936 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0322 00:09:59.967945 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0322 00:09:59.967951 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0322 00:09:59.967956 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0322 00:09:59.967963 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0322 00:09:59.967953 1 genericapiserver.go:546] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0322 00:09:59.969056 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-22T00:09:58Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.334763 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dsgwk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-5jnd7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344664 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod 
\"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344712 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344738 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.344769 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.344922 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.344945 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 
00:10:24.344970 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345032 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.345018675 +0000 UTC m=+97.367320048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345388 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345401 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345409 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc 
kubenswrapper[5116]: E0322 00:10:24.345434 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.345426768 +0000 UTC m=+97.367728141 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345473 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345493 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.34548788 +0000 UTC m=+97.367789253 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345560 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.345590 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.345582633 +0000 UTC m=+97.367884006 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.349023 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1d1e21356eefd317bcf9ab8691392e59de36a5c78a5c0e2abe762755ab45df92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-p
roxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pkhq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-66g6d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.366293 5116 status_manager.go:919] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec484e57-1508-45a3-99a3-51dfa8ef6195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:10:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\
":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8qp9r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-n9zvq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.377190 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe8361c-9ce7-48cd-9142-ae635e1b27d2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://5f2712849d12be66837cdb73e0858adfc9b9172ec8bbe1d83c7fe3fcc4bc8fe7\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://8c2b67442f55addc55a7f281f1b6a6441e1d6068e6f826dedb686cdc29ef2ec1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://609c703ea4eb165b2fdc69a88571190199d2dd374ea3629e50d7e6091c4552b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"15m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1543246d672acb399d96f28680b91e7dcf741cbb8ce21363dd09dec6dad2687c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387110 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387149 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387160 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387197 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.387206 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.393657 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b44f09af-f7e2-4bcb-bdba-55ff8b81f5de\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"},\\\"containerID\\\":\\\"cri-o://5c4690216bec52133687895e582b4bb77c5e98aa64b321ad9c9d506f55d3f00e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"300m\\\",\\\"memory\\\":\\\"600Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2600f5226a6713d12505ffd457e56ebd7d088161c72b9c8559bbea8a975297de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"40m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://79bb9087b191778249a1a55111db0643e3b60d19f757f975df02d11b33c9c101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a
6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://bfe93ca3eb874b172d7e5a437f62f0a3794ec8d1de985eadc78f9c5dc076e580\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:54Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://88776b5f47aa00647e827ac8c99416f29940ce9d6589329425a92597ab26ec51\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:53Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://18580e6f77485dbbf6c9384d7b066de66b1281944199ad05eb32241a77f1b8db\\\",\\\"exitCode\\\":0
,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd-auto-backup\\\",\\\"name\\\":\\\"etcd-auto-backup-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b8c90b6c961bb202b828dc8ac3d4da1dd89cbe82a5400089a87594b9666234de\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}}},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"},\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\
",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"60Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9fbf4e89cefa264fde65048c1a9bba1b6eafe0e57717eb8d9a23e9a445b8195\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.404000 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17b87002-b798-480a-8e17-83053d698239\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwt8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-fhkjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.414095 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-dgvkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc4541ce-7789-4670-bc75-5c2868e52ce0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://4f407ddd10876e1a8c14d096fd93218ee2e273d2e13b40b7f47ac97e4f7577c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0,1000500000],\\\"uid\\\":1000500000}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\
"},\\\"containerID\\\":\\\"cri-o://57fbcaaf231fb8acf6f6165c6611af0cf0d6cf135ea43f7a7481a2124db8ed43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0,1000500000],\\\"uid\\\":1000500000}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nt2j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-dgvkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.425671 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-9sq6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5188f25b-37c3-46f1-b939-199c6e082848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"65Mi\\\"},\\\"containerID\\\":\\\"cri-o://15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"65Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"moun
tPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqrqs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9sq6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc 
kubenswrapper[5116]: I0322 00:10:24.438328 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e17ab744-68a7-4a24-8ef2-556696d752fb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"},\\\"containerID\\\":\\\"cri-o://6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"20Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\
":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"},\\\"containerID\\\":\\\"cri-o://ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"300Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ntdv4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-57b78d8988-bd7p4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.448799 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-image-registry/node-ca-2rwjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1811891e-33d0-4500-a481-0e4aa2d3e95c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"10Mi\\\"},\\\"containerID\\\":\\\"cri-o://f51b492b0338abc8e31fd29413b4b5d43f33d541be8a08e10f8ebe50a03fb477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"10Mi\\\"}},\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":1001}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qscfn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-2rwjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.449560 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.449761 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.449745119 +0000 UTC m=+97.472046492 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.464916 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"68dcbc21-b4ce-4285-9a4b-101724f82f33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"resources\\\":{},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6efa070ceb93cc5fc2e76eab6d9c96ac3c4f8812085d0b6eb6e3f513b5bac782\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3454e762466e22e2a893650b9781823558bc6fdfda2aa4188aff3cb819014c4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/etc/whereabouts/config\\\",\\\"name\\\":\\\"whereabouts-flatfile-configmap\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7gh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bk75f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.476067 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"058dd31f-b3ad-4ab1-a174-760d8eb305f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://52601b921664be4e939b691ab6a7a52e7865f01815701526f277de4bf21520e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"20m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:52Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:16d5a229c172bde2f4238e8a88602fd6351d80b262f35484740a979d8b3567a5\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf2bdb501428b5446a15124d06a58c58c7cfb52a6b41e975ff1b3c08894e0cff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-22T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.485886 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-dns/node-resolver-nwnjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d36c245b-3d7f-48eb-848e-c54198ae38a4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"},\\\"containerID\\\":\\\"cri-o://b9bc4199ef1749e4a84f489f40e0390b5b26d91adc18920dda60b81ab52cadeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"21Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x5zcr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:10:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nwnjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489540 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489607 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489627 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489680 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.489698 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.497306 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0911025-04f5-4040-a72c-14769d03d8e0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-22T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://2d70ad1aa005302b2a06b86cd4b8f4df7506345c8e11f68b51f077be50023120\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\
":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"},\\\"containerID\\\":\\\"cri-o://05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"60m\\\",\\\"memory\\\":\\\"200Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:50Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://497485a3b5940835bde037d81157c05ecaaa45b09bdffff76bbc038694f328d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde72610
9a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]},{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://dac49ea656d097873f0dcd29b532de929dd705148709a0184938669a24b2d0b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"5m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:08:51Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":0,\\\"supplementalGroups\\\":[0],\\\"uid\\\":0}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/tmp\\\",\\\"name\\\":\\\"tmp-dir\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem\\\",\\\"name\\\":\\\"ca-trust-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/kubernetes\\\",\\\"name\\\":\\\"var-run-kubernetes\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-22T00:08:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.510517 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-operator/network-operator-7bdcf4f5bd-7fjxv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34177974-8d82-49d2-a763-391d0df3bbd8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"allocatedResources\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"},\\\"containerID\\\":\\\"cri-o://b8e5f28015a7839919e338305c619f5b1210d5514ab180e75985ccbd323c7240\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"resources\\\":{\\\"requests\\\":{\\\"cpu\\\":\\\"10m\\\",\\\"memory\\\":\\\"50Mi\\\"}},\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-22T00:10:23Z\\\"}},\\\"user\\\":{\\\"linux\\\":{\\\"gid\\\":65534,\\\"supplementalGroups\\\":[65534],\\\"uid\\\":65534}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7xz2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-7bdcf4f5bd-7fjxv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.522499 5116 status_manager.go:919] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f863fff9-286a-45fa-b8f0-8a86994b8440\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-22T00:10:22Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l7w75\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-5bb8f5cd97-xdvz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.551141 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.551313 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.551410 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:26.551391874 +0000 UTC m=+97.573693247 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591611 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591652 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591661 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591676 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.591692 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694224 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694274 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694286 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694303 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.694316 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696391 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696511 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696532 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696561 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.696561 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696629 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696710 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 22 00:10:24 crc kubenswrapper[5116]: E0322 00:10:24.696783 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.796965 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797025 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797066 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797093 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.797112 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.899930 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.899978 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.899990 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.900009 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:24 crc kubenswrapper[5116]: I0322 00:10:24.900021 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:24Z","lastTransitionTime":"2026-03-22T00:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002208 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002274 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002289 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002309 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.002320 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.069725 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab" exitCode=0 Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.069816 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"3a773f1e36416b2c124cfa148df2f80bb14dcce04409d501d212d4552fb6fdab"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074021 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074074 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074118 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074131 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.074142 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104338 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104376 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104385 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104400 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.104418 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.118109 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=4.118088376 podStartE2EDuration="4.118088376s" podCreationTimestamp="2026-03-22 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.117376573 +0000 UTC m=+96.139677966" watchObservedRunningTime="2026-03-22 00:10:25.118088376 +0000 UTC m=+96.140389749" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.153414 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.1533991869999998 podStartE2EDuration="3.153399187s" podCreationTimestamp="2026-03-22 00:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.153018784 +0000 UTC m=+96.175320177" watchObservedRunningTime="2026-03-22 00:10:25.153399187 +0000 UTC m=+96.175700560" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.204149 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9sq6c" podStartSLOduration=75.204125515 podStartE2EDuration="1m15.204125515s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.204103485 +0000 UTC m=+96.226404858" watchObservedRunningTime="2026-03-22 00:10:25.204125515 +0000 UTC m=+96.226426888" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206424 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206461 5116 
kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206471 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206487 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.206498 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.223082 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" podStartSLOduration=75.223058547 podStartE2EDuration="1m15.223058547s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.222715796 +0000 UTC m=+96.245017169" watchObservedRunningTime="2026-03-22 00:10:25.223058547 +0000 UTC m=+96.245359920" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.237013 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2rwjp" podStartSLOduration=75.236996419 podStartE2EDuration="1m15.236996419s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.236370749 
+0000 UTC m=+96.258672122" watchObservedRunningTime="2026-03-22 00:10:25.236996419 +0000 UTC m=+96.259297792" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.281994 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.281961866 podStartE2EDuration="4.281961866s" podCreationTimestamp="2026-03-22 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.271186973 +0000 UTC m=+96.293488356" watchObservedRunningTime="2026-03-22 00:10:25.281961866 +0000 UTC m=+96.304263239" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.304235 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nwnjb" podStartSLOduration=76.303384675 podStartE2EDuration="1m16.303384675s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.282884935 +0000 UTC m=+96.305186308" watchObservedRunningTime="2026-03-22 00:10:25.303384675 +0000 UTC m=+96.325686038" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.304403 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.304388618 podStartE2EDuration="3.304388618s" podCreationTimestamp="2026-03-22 00:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.300555996 +0000 UTC m=+96.322857369" watchObservedRunningTime="2026-03-22 00:10:25.304388618 +0000 UTC m=+96.326689991" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308680 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308713 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308722 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308733 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.308742 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411398 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411560 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411572 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411586 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.411610 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514810 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514854 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514869 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514883 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.514894 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616314 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616346 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616355 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616368 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.616378 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718086 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718124 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718132 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718145 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.718157 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820454 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820782 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820796 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820813 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.820825 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923295 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923350 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923362 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923381 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:25 crc kubenswrapper[5116]: I0322 00:10:25.923396 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:25Z","lastTransitionTime":"2026-03-22T00:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.024955 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025015 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025029 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025047 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.025062 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.078401 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5jnd7" event={"ID":"428b39f5-eb1c-4f65-b7a4-eeb6e84860cc","Type":"ContainerStarted","Data":"c6721d40927bfc9e64db3dad19ca523af11553569d2a7c235c5ff4c77b73a4b7"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.080071 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"c6ce593ebbff0cbc20d49368fa0db752daf86b2c7c0ce84e6c3ed7de07047717"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.083929 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.102916 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podStartSLOduration=76.102856763 podStartE2EDuration="1m16.102856763s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:25.388094554 +0000 UTC m=+96.410395957" watchObservedRunningTime="2026-03-22 00:10:26.102856763 +0000 UTC m=+97.125158136"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127233 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127243 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127260 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.127271 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229854 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229900 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229910 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229926 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.229936 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.331978 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332024 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332035 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332050 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.332060 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379304 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379361 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379387 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.379415 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379583 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379609 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379622 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379675 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379734 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379770 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379776 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379783 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379685 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379666878 +0000 UTC m=+101.401968261 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379867 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379850523 +0000 UTC m=+101.402151896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379892 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379879374 +0000 UTC m=+101.402180787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.379907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.379900524 +0000 UTC m=+101.402201967 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435096 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435185 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435215 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435236 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.435248 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.480156 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.480347 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.480326822 +0000 UTC m=+101.502628195 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537492 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537543 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537555 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537573 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.537585 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.581439 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.581626 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.581717 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:30.581697898 +0000 UTC m=+101.603999271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639316 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639371 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639383 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639400 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.639412 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696453 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696530 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696623 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.696624 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.696771 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.696835 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.696974 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:26 crc kubenswrapper[5116]: E0322 00:10:26.697243 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742377 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742431 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742443 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742461 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.742477 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845090 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845141 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845152 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845194 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.845208 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948475 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948544 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948557 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948577 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:26 crc kubenswrapper[5116]: I0322 00:10:26.948590 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:26Z","lastTransitionTime":"2026-03-22T00:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050640 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050699 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050714 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050731 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.050743 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.089090 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="c6ce593ebbff0cbc20d49368fa0db752daf86b2c7c0ce84e6c3ed7de07047717" exitCode=0
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.089206 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"c6ce593ebbff0cbc20d49368fa0db752daf86b2c7c0ce84e6c3ed7de07047717"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.152942 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153220 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153232 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153244 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.153254 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255817 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255860 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255871 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255885 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.255895 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358281 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358319 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358348 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358363 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.358372 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460780 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460829 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460843 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460859 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.460872 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563224 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563270 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563281 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563297 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.563309 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665424 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665480 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665491 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665508 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.665520 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766792 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766844 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766859 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766876 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.766890 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869187 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869234 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869248 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869264 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.869278 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971189 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971229 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971240 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971256 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:27 crc kubenswrapper[5116]: I0322 00:10:27.971269 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:27Z","lastTransitionTime":"2026-03-22T00:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073388 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073798 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073812 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073829 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.073842 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:28Z","lastTransitionTime":"2026-03-22T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.094653 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="865bb892d10d2cc98809e713630d1eee197b60ebea411ebed87f389b4f38d103" exitCode=0
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.094745 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"865bb892d10d2cc98809e713630d1eee197b60ebea411ebed87f389b4f38d103"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.101402 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"}
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171851 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171911 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171924 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171973 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.171993 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:28Z","lastTransitionTime":"2026-03-22T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.186921 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.186980 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.186993 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.187014 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeNotReady" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.187025 5116 setters.go:618] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-22T00:10:28Z","lastTransitionTime":"2026-03-22T00:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.215115 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r"] Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.221604 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.223570 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.223913 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.224037 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.224425 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\"" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.303841 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.303920 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc1de7fa-cee5-4f90-80c9-e9cba187456d-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.304117 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.304221 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc1de7fa-cee5-4f90-80c9-e9cba187456d-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.304269 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc1de7fa-cee5-4f90-80c9-e9cba187456d-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405558 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc1de7fa-cee5-4f90-80c9-e9cba187456d-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405649 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " 
pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc1de7fa-cee5-4f90-80c9-e9cba187456d-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405703 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc1de7fa-cee5-4f90-80c9-e9cba187456d-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405762 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405825 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.405872 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/cc1de7fa-cee5-4f90-80c9-e9cba187456d-etc-ssl-certs\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.406687 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cc1de7fa-cee5-4f90-80c9-e9cba187456d-service-ca\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.419036 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc1de7fa-cee5-4f90-80c9-e9cba187456d-serving-cert\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.430483 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cc1de7fa-cee5-4f90-80c9-e9cba187456d-kube-api-access\") pod \"cluster-version-operator-7c9b9cfd6-jrw2r\" (UID: \"cc1de7fa-cee5-4f90-80c9-e9cba187456d\") " pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.544346 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" Mar 22 00:10:28 crc kubenswrapper[5116]: W0322 00:10:28.559370 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc1de7fa_cee5_4f90_80c9_e9cba187456d.slice/crio-d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf WatchSource:0}: Error finding container d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf: Status 404 returned error can't find the container with id d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.688859 5116 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696762 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696762 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696853 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.696860 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697001 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697115 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697191 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:28 crc kubenswrapper[5116]: E0322 00:10:28.697262 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:28 crc kubenswrapper[5116]: I0322 00:10:28.698812 5116 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 22 00:10:29 crc kubenswrapper[5116]: I0322 00:10:29.105919 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" event={"ID":"cc1de7fa-cee5-4f90-80c9-e9cba187456d","Type":"ContainerStarted","Data":"d4ede4220a98a9362f63deee58b33a8f5ceb6d2314eba0d669dc951f2c019bbf"} Mar 22 00:10:29 crc kubenswrapper[5116]: I0322 00:10:29.110138 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="7523ff5e45ba0c266da43232745d320ee15d7059b1b8414c9d98217b5eabcc7f" exitCode=0 Mar 22 00:10:29 crc kubenswrapper[5116]: I0322 00:10:29.110225 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"7523ff5e45ba0c266da43232745d320ee15d7059b1b8414c9d98217b5eabcc7f"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.116713 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"00794bbb5aae6e05bd0e50af39e0b4d4e26deff3b187d7676d95950bdffb8dd8"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.121364 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerStarted","Data":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 
00:10:30.121644 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.121702 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.121712 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.123030 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" event={"ID":"cc1de7fa-cee5-4f90-80c9-e9cba187456d","Type":"ContainerStarted","Data":"e5174ff347d5ce3b36e185f0c9b14090e9c9bb0ff5f625d98f27d20f17fc867e"} Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.185747 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.196158 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.202288 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podStartSLOduration=80.202269003 podStartE2EDuration="1m20.202269003s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:30.172330303 +0000 UTC m=+101.194631696" watchObservedRunningTime="2026-03-22 00:10:30.202269003 +0000 UTC m=+101.224570386" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.202513 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-7c9b9cfd6-jrw2r" podStartSLOduration=80.202508151 podStartE2EDuration="1m20.202508151s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:30.201377855 +0000 UTC m=+101.223679228" watchObservedRunningTime="2026-03-22 00:10:30.202508151 +0000 UTC m=+101.224809524" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.433725 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.434114 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.434145 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.434198 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: 
\"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434368 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434385 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434394 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434406 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434433 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434459 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434472 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod 
openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434446 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.43443243 +0000 UTC m=+109.456733803 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434527 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434528 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.434510932 +0000 UTC m=+109.456812305 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434583 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.434569254 +0000 UTC m=+109.456870677 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.434605 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.434591005 +0000 UTC m=+109.456892408 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.535233 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.535436 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.535404964 +0000 UTC m=+109.557706337 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.637233 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.637458 5116 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.637559 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs podName:94c19a90-c2c9-4236-98be-a0516dbb840b nodeName:}" failed. No retries permitted until 2026-03-22 00:10:38.637535325 +0000 UTC m=+109.659836698 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs") pod "network-metrics-daemon-wlq8c" (UID: "94c19a90-c2c9-4236-98be-a0516dbb840b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.697531 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.697599 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.697668 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.697780 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b" Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.697849 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.697940 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:30 crc kubenswrapper[5116]: I0322 00:10:30.698004 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:30 crc kubenswrapper[5116]: E0322 00:10:30.698057 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:31 crc kubenswrapper[5116]: I0322 00:10:31.128795 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="00794bbb5aae6e05bd0e50af39e0b4d4e26deff3b187d7676d95950bdffb8dd8" exitCode=0
Mar 22 00:10:31 crc kubenswrapper[5116]: I0322 00:10:31.129457 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"00794bbb5aae6e05bd0e50af39e0b4d4e26deff3b187d7676d95950bdffb8dd8"}
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.134541 5116 generic.go:358] "Generic (PLEG): container finished" podID="68dcbc21-b4ce-4285-9a4b-101724f82f33" containerID="7c538fd9b69d6bb7a46afcbfeec835f156752a4d080e583406021eb7d30194a7" exitCode=0
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.134604 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerDied","Data":"7c538fd9b69d6bb7a46afcbfeec835f156752a4d080e583406021eb7d30194a7"}
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.244283 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wlq8c"]
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.244480 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.244622 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.696879 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.696925 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.697308 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.697390 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:32 crc kubenswrapper[5116]: I0322 00:10:32.696963 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:32 crc kubenswrapper[5116]: E0322 00:10:32.697473 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:33 crc kubenswrapper[5116]: I0322 00:10:33.141276 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bk75f" event={"ID":"68dcbc21-b4ce-4285-9a4b-101724f82f33","Type":"ContainerStarted","Data":"42f55028ebe0714ec8dc6e34ea7f6aaa1c102bd83b7f126abea22b6b5c324427"}
Mar 22 00:10:33 crc kubenswrapper[5116]: I0322 00:10:33.697195 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:33 crc kubenswrapper[5116]: E0322 00:10:33.697387 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:34 crc kubenswrapper[5116]: I0322 00:10:34.696803 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:34 crc kubenswrapper[5116]: I0322 00:10:34.696813 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:34 crc kubenswrapper[5116]: E0322 00:10:34.696954 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:34 crc kubenswrapper[5116]: E0322 00:10:34.697084 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:34 crc kubenswrapper[5116]: I0322 00:10:34.697142 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:34 crc kubenswrapper[5116]: E0322 00:10:34.697255 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:35 crc kubenswrapper[5116]: I0322 00:10:35.696807 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:35 crc kubenswrapper[5116]: E0322 00:10:35.696956 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wlq8c" podUID="94c19a90-c2c9-4236-98be-a0516dbb840b"
Mar 22 00:10:36 crc kubenswrapper[5116]: I0322 00:10:36.696713 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5"
Mar 22 00:10:36 crc kubenswrapper[5116]: E0322 00:10:36.697143 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" podUID="f863fff9-286a-45fa-b8f0-8a86994b8440"
Mar 22 00:10:36 crc kubenswrapper[5116]: I0322 00:10:36.696778 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6"
Mar 22 00:10:36 crc kubenswrapper[5116]: I0322 00:10:36.696746 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:36 crc kubenswrapper[5116]: E0322 00:10:36.697270 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" podUID="6a9ae5f6-97bd-46ac-bafa-ca1b4452a141"
Mar 22 00:10:36 crc kubenswrapper[5116]: E0322 00:10:36.697453 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fhkjl" podUID="17b87002-b798-480a-8e17-83053d698239"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.534690 5116 kubelet_node_status.go:736] "Recording event message for node" node="crc" event="NodeReady"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.534818 5116 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.569504 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bk75f" podStartSLOduration=87.569483485 podStartE2EDuration="1m27.569483485s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:33.163539568 +0000 UTC m=+104.185840941" watchObservedRunningTime="2026-03-22 00:10:37.569483485 +0000 UTC m=+108.591784858"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.570133 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-m5dds"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.577540 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.577780 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.579775 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.580091 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.583510 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.584439 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.584879 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.585045 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587018 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587402 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587722 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.587962 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.588296 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.588976 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-w2nq2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.590427 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.593382 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.593723 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.594282 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.597308 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29568960-tjk88"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.598393 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.598427 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.600432 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.603369 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-747b44746d-cb5p2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.606078 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.607152 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-cb5p2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.607757 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.611346 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.612329 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.613664 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614658 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614731 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614765 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.614917 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.615073 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.615212 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.623642 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.624612 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.624916 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625112 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625316 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625504 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625684 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.625858 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.626023 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.626272 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-rs58m\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.626513 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.627085 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.627299 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.628042 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.628099 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.628544 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.629362 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.629606 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.629865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.634216 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.636942 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.641833 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.643088 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.648188 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.648331 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.648531 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.649272 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.649549 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.649925 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.650961 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.651203 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.651474 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.651683 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.653685 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.653700 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.655090 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-67c89758df-vnd4f"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.661256 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.662435 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-wb6r8"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.662547 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663416 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663796 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663852 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.663992 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664000 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664227 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664578 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.664980 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.666689 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.666875 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667003 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667199 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667257 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.667501 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670236 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670400 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670548 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.670894 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671142 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671281 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671363 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671594 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671753 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.671937 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.672066 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.672227 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.672359 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.677296 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d44f6ddf-9g5sg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.678301 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.678397 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.680120 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.682756 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.687614 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.688076 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.688317 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.689710 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.703037 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.707505 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8"
Mar 22 00:10:37 crc kubenswrapper[5116]: E0322 00:10:37.708747 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(3a14caf222afb62aaabdc47808b6f944)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.730246 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.731602 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.731736 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.738623 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.738784 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.740489 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750327 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750441 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750516 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750343 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750549 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.750625 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755245 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755453 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755595 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755709 5116 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755756 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.755997 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a442ae21-7eff-4990-998f-27afcb839a6c-config\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756030 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbrfl\" (UniqueName: \"kubernetes.io/projected/a442ae21-7eff-4990-998f-27afcb839a6c-kube-api-access-hbrfl\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756037 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756234 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756279 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756534 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756914 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.756936 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757431 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9648k\" (UniqueName: \"kubernetes.io/projected/bac29c51-c815-4827-bd3d-c74f2e31f842-kube-api-access-9648k\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757462 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757497 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757525 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7xbb\" 
(UniqueName: \"kubernetes.io/projected/9884d9ba-fbeb-40db-8105-de302262478b-kube-api-access-c7xbb\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757575 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pngr\" (UniqueName: \"kubernetes.io/projected/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-kube-api-access-4pngr\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757646 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757674 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-auth-proxy-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757718 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " 
pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757772 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757846 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757869 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757882 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757912 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-serving-cert\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 
00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.757933 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-config\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758001 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq7tp\" (UniqueName: \"kubernetes.io/projected/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-kube-api-access-tq7tp\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758068 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-client\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758108 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrpct\" (UniqueName: \"kubernetes.io/projected/45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0-kube-api-access-xrpct\") pod \"downloads-747b44746d-cb5p2\" (UID: \"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0\") " pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-encryption-config\") pod 
\"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758194 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758220 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-images\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758241 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-policies\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758280 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-dir\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758362 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzqwb\" 
(UniqueName: \"kubernetes.io/projected/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-kube-api-access-bzqwb\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758412 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9884d9ba-fbeb-40db-8105-de302262478b-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758477 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a442ae21-7eff-4990-998f-27afcb839a6c-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758495 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758509 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " 
pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758551 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758566 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-image-import-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758615 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit-dir\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758648 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758675 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvntc\" (UniqueName: 
\"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758712 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-config\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758744 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-node-pullsecrets\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758762 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758778 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bac29c51-c815-4827-bd3d-c74f2e31f842-serving-cert\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: 
I0322 00:10:37.758830 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758886 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758930 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-encryption-config\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.758957 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759027 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-machine-approver-tls\") 
pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759081 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-serving-cert\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759098 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759114 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-serving-ca\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759158 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-client\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.759224 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.765025 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.768207 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.768763 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-42tp2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.771491 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.777657 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.777792 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68cf44c8b8-2jlxw"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781577 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781631 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781827 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.781887 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.783036 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.790647 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.790813 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.795004 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.795270 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.797104 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.797469 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.798998 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\""
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.800694 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.800983 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.803912 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-bkst6"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.804367 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.809921 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.810000 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.816897 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.817013 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.822744 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.822926 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.828570 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-cb5p2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.828601 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.828732 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.833077 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.833380 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835839 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-wb6r8"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835872 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835886 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.835900 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.836384 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.839211 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.839496 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.854694 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.855053 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858293 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858324 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858337 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858349 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-m5dds"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858362 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858376 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-74545575db-z8df4"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.858510 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.859911 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.859941 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-image-import-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860069 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit-dir\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860102 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860226 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit-dir\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860238 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvntc\" (UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860480 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-config\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860496 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-node-pullsecrets\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860530 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860549 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bac29c51-c815-4827-bd3d-c74f2e31f842-serving-cert\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860568 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860586 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860603 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-encryption-config\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860618 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.860614 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-node-pullsecrets\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861117 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861157 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-image-import-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861534 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29568960-tjk88"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861560 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-9g5sg"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861577 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-42tp2"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861591 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861604 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861618 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"]
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861661 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-config\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861666 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861801 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-z8df4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861854 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-machine-approver-tls\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861971 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-serving-cert\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.861990 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862008 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862064 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-serving-ca\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862082 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862104 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-client\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862119 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862147 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a442ae21-7eff-4990-998f-27afcb839a6c-config\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862225 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862244 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbrfl\" (UniqueName: \"kubernetes.io/projected/a442ae21-7eff-4990-998f-27afcb839a6c-kube-api-access-hbrfl\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862679 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-service-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.862771 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-audit\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.863807 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.863869 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-serving-ca\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864432 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bac29c51-c815-4827-bd3d-c74f2e31f842-trusted-ca-bundle\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864480 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-trusted-ca-bundle\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864521 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9648k\" (UniqueName: \"kubernetes.io/projected/bac29c51-c815-4827-bd3d-c74f2e31f842-kube-api-access-9648k\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864549 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864574 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864590 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7xbb\" (UniqueName: \"kubernetes.io/projected/9884d9ba-fbeb-40db-8105-de302262478b-kube-api-access-c7xbb\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864608 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pngr\" (UniqueName: \"kubernetes.io/projected/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-kube-api-access-4pngr\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864631 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864648 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-auth-proxy-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864664 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864682 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864705 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864721 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864740 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-serving-cert\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864756 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-config\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864773 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tq7tp\" (UniqueName: \"kubernetes.io/projected/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-kube-api-access-tq7tp\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864802 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-client\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864829 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrpct\" (UniqueName: \"kubernetes.io/projected/45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0-kube-api-access-xrpct\") pod \"downloads-747b44746d-cb5p2\" (UID: \"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0\") " pod="openshift-console/downloads-747b44746d-cb5p2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864855 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-encryption-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864876 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864893 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-images\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864909 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-policies\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864936 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-dir\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864955 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzqwb\" (UniqueName: \"kubernetes.io/projected/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-kube-api-access-bzqwb\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.864982 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9884d9ba-fbeb-40db-8105-de302262478b-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865013 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a442ae21-7eff-4990-998f-27afcb839a6c-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865029 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865044 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865138 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a442ae21-7eff-4990-998f-27afcb839a6c-config\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865458 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.865866 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-dir\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.866531 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-auth-proxy-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867158 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867158 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bac29c51-c815-4827-bd3d-c74f2e31f842-serving-cert\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867197 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-trusted-ca-bundle\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867328 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-config\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867602 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-audit-policies\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.867804 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.868009 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-encryption-config\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.868324 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-serving-ca\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.868755 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.870751 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-config\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871134 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9884d9ba-fbeb-40db-8105-de302262478b-images\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871546 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-serving-cert\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871749 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-serving-cert\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.871755 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-encryption-config\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872139 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"
Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872538 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID:
\"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872840 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-machine-approver-tls\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.872942 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a442ae21-7eff-4990-998f-27afcb839a6c-serving-cert\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.873016 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-etcd-client\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.873201 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-etcd-client\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.873682 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.875050 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9884d9ba-fbeb-40db-8105-de302262478b-machine-api-operator-tls\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.875327 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.876545 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.876585 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.876692 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.880800 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.896657 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.896795 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.899819 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902807 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902832 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902845 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902855 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902866 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-w2nq2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902883 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-r75lk"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.902980 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.909781 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-55dml"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.909983 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.913432 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.913561 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.917159 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mzck5"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.917366 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.919039 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922329 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922359 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922370 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-vnd4f"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922379 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.922450 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929921 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-z8df4"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929947 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929957 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-bkst6"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.929968 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rkswl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.930065 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934802 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934868 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934885 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934898 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934915 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"] Mar 22 00:10:37 crc 
kubenswrapper[5116]: I0322 00:10:37.934928 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934940 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934941 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934952 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934967 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mzck5"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934983 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rkswl"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.934993 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.935011 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-55dml"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.935023 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"] Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.939584 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.959054 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.980377 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\"" Mar 22 00:10:37 crc kubenswrapper[5116]: I0322 00:10:37.999703 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.019136 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.039310 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.059241 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.080473 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.099622 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.119897 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.139688 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.159386 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 
22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.179990 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.200465 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.219580 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.240599 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.267149 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.279084 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.299927 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.319964 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.340253 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.360590 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.379809 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.400602 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.420391 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.439480 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.459462 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474089 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474151 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: 
\"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.474260 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474267 5116 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474406 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474376777 +0000 UTC m=+125.496678150 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474448 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474480 5116 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474583 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert podName:6a9ae5f6-97bd-46ac-bafa-ca1b4452a141 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474563583 +0000 UTC m=+125.496864956 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert") pod "networking-console-plugin-5ff7774fd9-nljh6" (UID: "6a9ae5f6-97bd-46ac-bafa-ca1b4452a141") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474486 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474627 5116 projected.go:194] Error preparing data for projected volume kube-api-access-gwt8b for pod openshift-network-diagnostics/network-check-target-fhkjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474660 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b podName:17b87002-b798-480a-8e17-83053d698239 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474653996 +0000 UTC m=+125.496955389 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gwt8b" (UniqueName: "kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b") pod "network-check-target-fhkjl" (UID: "17b87002-b798-480a-8e17-83053d698239") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474466 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474690 5116 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474704 5116 projected.go:194] Error preparing data for projected volume kube-api-access-l7w75 for pod openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.474739 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75 podName:f863fff9-286a-45fa-b8f0-8a86994b8440 nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.474733109 +0000 UTC m=+125.497034482 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-l7w75" (UniqueName: "kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75") pod "network-check-source-5bb8f5cd97-xdvz5" (UID: "f863fff9-286a-45fa-b8f0-8a86994b8440") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.480355 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.499570 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.519556 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.539310 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.559122 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.575714 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:38 crc kubenswrapper[5116]: E0322 00:10:38.576097 5116 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.576059824 +0000 UTC m=+125.598361247 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.579374 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.600556 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.619699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.640344 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.659699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.677723 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.679294 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.683189 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94c19a90-c2c9-4236-98be-a0516dbb840b-metrics-certs\") pod \"network-metrics-daemon-wlq8c\" (UID: \"94c19a90-c2c9-4236-98be-a0516dbb840b\") " pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.696568 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.696739 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.696768 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.699940 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.720362 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.740565 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.779555 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.792072 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wlq8c" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.800699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.819660 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.837669 5116 request.go:752] "Waited before sending request" delay="1.008693167s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.839493 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.858801 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.880253 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.900676 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.919672 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.940357 
5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.960597 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.981017 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\"" Mar 22 00:10:38 crc kubenswrapper[5116]: I0322 00:10:38.999668 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.019383 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.039283 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.059326 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.080486 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.120138 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.139458 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.160525 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.179765 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.183991 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.184686 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.684662566 +0000 UTC m=+110.706963979 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185189 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413bd8dc-5257-4fd7-95c1-01f6d79278ee-config\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185230 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/413bd8dc-5257-4fd7-95c1-01f6d79278ee-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185255 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/720e1d69-81b3-4fdb-94c1-dabb0707c833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185281 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185309 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185332 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-trusted-ca-bundle\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185459 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-trusted-ca\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185537 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: 
\"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185681 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-service-ca\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185728 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185862 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-xkpc6\" (UniqueName: \"kubernetes.io/projected/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-kube-api-access-xkpc6\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185909 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv46d\" (UniqueName: \"kubernetes.io/projected/413bd8dc-5257-4fd7-95c1-01f6d79278ee-kube-api-access-tv46d\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.185957 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.186002 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4117709-89bd-4e72-8016-0c25c0ece2c6-serving-cert\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.186046 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95r42\" (UniqueName: \"kubernetes.io/projected/b4117709-89bd-4e72-8016-0c25c0ece2c6-kube-api-access-95r42\") pod 
\"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.186091 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.187614 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.187692 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188317 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/720e1d69-81b3-4fdb-94c1-dabb0707c833-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188395 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtlfq\" (UniqueName: \"kubernetes.io/projected/3bf5ae18-6e08-436b-939f-03347eda68a8-kube-api-access-wtlfq\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188430 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc8c\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-kube-api-access-zqc8c\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188503 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188596 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188685 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413bd8dc-5257-4fd7-95c1-01f6d79278ee-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188770 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-available-featuregates\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188846 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188894 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188937 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-tmp\") pod 
\"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.188966 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189093 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189123 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bf5ae18-6e08-436b-939f-03347eda68a8-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189152 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsw9z\" (UniqueName: \"kubernetes.io/projected/4c2755ce-817d-47b0-9f19-7218641d0c5b-kube-api-access-zsw9z\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 
00:10:39.189211 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189247 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-oauth-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189276 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189306 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189343 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-serving-cert\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189371 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189402 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189435 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: 
\"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189505 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189533 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-config\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189569 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-oauth-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.189606 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.216991 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvntc\" 
(UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"controller-manager-65b6cccf98-9kdkj\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.219739 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.226880 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wlq8c"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.241732 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.259994 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290596 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.290815 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.790786273 +0000 UTC m=+110.813087646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290900 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290940 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-csi-data-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290961 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-srv-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.290987 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod 
\"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291080 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-config\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291244 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291276 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-node-bootstrap-token\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291294 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56318568-cab8-4d5b-9a20-4531fc8aad60-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291311 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgc4\" (UniqueName: \"kubernetes.io/projected/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-kube-api-access-psgc4\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291328 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2512f5ef-a611-4637-b41f-41185def421b-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413bd8dc-5257-4fd7-95c1-01f6d79278ee-config\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291534 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/413bd8dc-5257-4fd7-95c1-01f6d79278ee-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291596 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/56318568-cab8-4d5b-9a20-4531fc8aad60-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291651 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prql2\" (UniqueName: \"kubernetes.io/projected/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-kube-api-access-prql2\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.291688 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.791666811 +0000 UTC m=+110.813968344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291759 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/720e1d69-81b3-4fdb-94c1-dabb0707c833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291790 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291822 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291845 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-trusted-ca\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291866 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291888 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291914 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56318568-cab8-4d5b-9a20-4531fc8aad60-config\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291939 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" 
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291963 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-srv-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.291981 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292006 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292038 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkpc6\" (UniqueName: \"kubernetes.io/projected/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-kube-api-access-xkpc6\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292055 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod 
\"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292071 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/413bd8dc-5257-4fd7-95c1-01f6d79278ee-config\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292070 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-profile-collector-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292097 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvwk\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-kube-api-access-ttvwk\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292114 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbsf\" (UniqueName: \"kubernetes.io/projected/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-kube-api-access-fkbsf\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292130 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lck5k\" (UniqueName: \"kubernetes.io/projected/a1258288-8146-4cba-9d66-2a88e35a1fe9-kube-api-access-lck5k\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292160 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/413bd8dc-5257-4fd7-95c1-01f6d79278ee-tmp\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292324 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65b5ebf3-054c-4827-96b2-7ea0a26f20af-tmp-dir\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292389 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292427 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6g2r\" (UniqueName: \"kubernetes.io/projected/73928a0d-7a97-4c03-a5e8-6ab37119261c-kube-api-access-p6g2r\") pod 
\"migrator-866fcbc849-m5rc6\" (UID: \"73928a0d-7a97-4c03-a5e8-6ab37119261c\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292457 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-serving-cert\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292495 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292535 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/720e1d69-81b3-4fdb-94c1-dabb0707c833-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292571 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xgch\" (UniqueName: \"kubernetes.io/projected/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-kube-api-access-5xgch\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292603 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7b12ebb-b568-4d15-abde-14db5041d5d2-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292643 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292679 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292716 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292755 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413bd8dc-5257-4fd7-95c1-01f6d79278ee-serving-cert\") pod 
\"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292785 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-serving-cert\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292816 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292852 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292884 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-cabundle\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292916 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-webhook-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.292990 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-tmp\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293017 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293025 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e67bda-2839-4364-9a75-54864090dc1f-config\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293058 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-images\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293090 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293205 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293635 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293687 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-stats-auth\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf866f04-6739-40da-8c1c-36d192472220-tmp-dir\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293748 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2512f5ef-a611-4637-b41f-41185def421b-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293769 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-client\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.293957 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " 
pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294391 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-trusted-ca\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294442 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56318568-cab8-4d5b-9a20-4531fc8aad60-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294518 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294803 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.294889 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsw9z\" (UniqueName: 
\"kubernetes.io/projected/4c2755ce-817d-47b0-9f19-7218641d0c5b-kube-api-access-zsw9z\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295045 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-plugins-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295081 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eea7869-af21-4009-856f-65219d64ceea-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295213 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295244 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 
00:10:39.295264 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295309 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-certs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295362 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295397 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2512f5ef-a611-4637-b41f-41185def421b-config\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295515 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-socket-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295549 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4fhd\" (UniqueName: \"kubernetes.io/projected/1bb9e03f-ef85-4dbe-802f-d529e97b092c-kube-api-access-h4fhd\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295618 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295776 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.295878 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-config\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296196 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf866f04-6739-40da-8c1c-36d192472220-kube-api-access\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296277 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296329 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg9xh\" (UniqueName: \"kubernetes.io/projected/65b5ebf3-054c-4827-96b2-7ea0a26f20af-kube-api-access-sg9xh\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296380 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-config\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296442 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-oauth-config\") pod \"console-64d44f6ddf-9g5sg\" 
(UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296492 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296543 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296594 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-tmp-dir\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296651 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296701 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e67bda-2839-4364-9a75-54864090dc1f-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296744 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4117709-89bd-4e72-8016-0c25c0ece2c6-config\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296755 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcvxk\" (UniqueName: \"kubernetes.io/projected/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-kube-api-access-mcvxk\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296645 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296909 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: 
\"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.296973 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd443947-7241-49e3-9d98-f55329818dcc-webhook-certs\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297029 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-trusted-ca-bundle\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297077 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2512f5ef-a611-4637-b41f-41185def421b-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297122 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-key\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297222 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65b5ebf3-054c-4827-96b2-7ea0a26f20af-metrics-tls\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297339 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-service-ca\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297420 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4117709-89bd-4e72-8016-0c25c0ece2c6-serving-cert\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297476 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95r42\" (UniqueName: \"kubernetes.io/projected/b4117709-89bd-4e72-8016-0c25c0ece2c6-kube-api-access-95r42\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297525 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297605 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rjsf\" (UniqueName: \"kubernetes.io/projected/93e67bda-2839-4364-9a75-54864090dc1f-kube-api-access-4rjsf\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297669 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297747 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297812 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1258288-8146-4cba-9d66-2a88e35a1fe9-tmpfs\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297895 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-service-ca\") pod 
\"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297951 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tv46d\" (UniqueName: \"kubernetes.io/projected/413bd8dc-5257-4fd7-95c1-01f6d79278ee-kube-api-access-tv46d\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.297990 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298009 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-mountpoint-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298067 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00dffd10-d567-431f-8dd9-390443f26d96-service-ca-bundle\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298119 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xp2b\" (UniqueName: \"kubernetes.io/projected/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-kube-api-access-6xp2b\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298148 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqx7\" (UniqueName: \"kubernetes.io/projected/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-kube-api-access-9vqx7\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298189 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-tmp-dir\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298225 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298247 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzqjt\" (UniqueName: \"kubernetes.io/projected/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-kube-api-access-gzqjt\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298274 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-default-certificate\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298299 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtlfq\" (UniqueName: \"kubernetes.io/projected/3bf5ae18-6e08-436b-939f-03347eda68a8-kube-api-access-wtlfq\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298323 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc8c\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-kube-api-access-zqc8c\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298345 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-metrics-certs\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298371 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/bd443947-7241-49e3-9d98-f55329818dcc-kube-api-access-52j77\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298407 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298431 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298450 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-cert\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298472 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcwsm\" (UniqueName: \"kubernetes.io/projected/0eea7869-af21-4009-856f-65219d64ceea-kube-api-access-rcwsm\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 
00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298495 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-available-featuregates\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298524 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b12ebb-b568-4d15-abde-14db5041d5d2-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298544 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-metrics-tls\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298567 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298601 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmdw\" (UniqueName: \"kubernetes.io/projected/00dffd10-d567-431f-8dd9-390443f26d96-kube-api-access-8bmdw\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: 
\"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298619 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298646 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf2fs\" (UniqueName: \"kubernetes.io/projected/dc9230ca-3a51-4ee3-976c-38c27605db87-kube-api-access-lf2fs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298668 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf866f04-6739-40da-8c1c-36d192472220-serving-cert\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298718 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-config-volume\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298746 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0eea7869-af21-4009-856f-65219d64ceea-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298803 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxs7g\" (UniqueName: \"kubernetes.io/projected/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-kube-api-access-hxs7g\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298876 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-tmpfs\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.298963 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-oauth-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299004 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-serving-cert\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" 
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299057 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299089 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299142 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-registration-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299203 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bf5ae18-6e08-436b-939f-03347eda68a8-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299235 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-ca-trust-extracted-pem\") pod 
\"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299293 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktdtc\" (UniqueName: \"kubernetes.io/projected/fe48c9c2-8783-475b-a961-d5a4110cb452-kube-api-access-ktdtc\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299320 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-apiservice-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299389 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fe48c9c2-8783-475b-a961-d5a4110cb452-tmpfs\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299439 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 
00:10:39.299487 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299540 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf866f04-6739-40da-8c1c-36d192472220-config\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.299916 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.300443 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\""
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.300898 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-oauth-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.301405 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-trusted-ca-bundle\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.301616 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.302715 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.303388 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-oauth-serving-cert\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.303887 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/413bd8dc-5257-4fd7-95c1-01f6d79278ee-serving-cert\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.303903 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-console-config\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.304596 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.304930 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305052 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305104 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-available-featuregates\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305237 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4c2755ce-817d-47b0-9f19-7218641d0c5b-service-ca\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.305511 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.306220 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.307367 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.307493 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.308017 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4117709-89bd-4e72-8016-0c25c0ece2c6-serving-cert\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.308246 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bf5ae18-6e08-436b-939f-03347eda68a8-samples-operator-tls\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.308591 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-serving-cert\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.309491 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.309958 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbrfl\" (UniqueName: \"kubernetes.io/projected/a442ae21-7eff-4990-998f-27afcb839a6c-kube-api-access-hbrfl\") pod \"openshift-apiserver-operator-846cbfc458-vdh4n\" (UID: \"a442ae21-7eff-4990-998f-27afcb839a6c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.312785 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.319745 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.320529 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-tmp\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.320597 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted-pem\" (UniqueName: \"kubernetes.io/empty-dir/720e1d69-81b3-4fdb-94c1-dabb0707c833-ca-trust-extracted-pem\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.320945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/720e1d69-81b3-4fdb-94c1-dabb0707c833-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.321404 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/720e1d69-81b3-4fdb-94c1-dabb0707c833-trusted-ca\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.357812 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzqwb\" (UniqueName: \"kubernetes.io/projected/8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0-kube-api-access-bzqwb\") pod \"apiserver-9ddfb9f55-m5dds\" (UID: \"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0\") " pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.375284 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrpct\" (UniqueName: \"kubernetes.io/projected/45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0-kube-api-access-xrpct\") pod \"downloads-747b44746d-cb5p2\" (UID: \"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0\") " pod="openshift-console/downloads-747b44746d-cb5p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401218 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401652 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a1258288-8146-4cba-9d66-2a88e35a1fe9-tmpfs\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401680 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-service-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401731 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-mountpoint-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401783 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00dffd10-d567-431f-8dd9-390443f26d96-service-ca-bundle\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.401808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xp2b\" (UniqueName: \"kubernetes.io/projected/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-kube-api-access-6xp2b\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402059 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqx7\" (UniqueName: \"kubernetes.io/projected/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-kube-api-access-9vqx7\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402088 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-tmp-dir\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzqjt\" (UniqueName: \"kubernetes.io/projected/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-kube-api-access-gzqjt\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402645 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-default-certificate\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402743 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-metrics-certs\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402778 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/bd443947-7241-49e3-9d98-f55329818dcc-kube-api-access-52j77\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402828 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-cert\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402852 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcwsm\" (UniqueName: \"kubernetes.io/projected/0eea7869-af21-4009-856f-65219d64ceea-kube-api-access-rcwsm\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402878 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b12ebb-b568-4d15-abde-14db5041d5d2-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402897 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-metrics-tls\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402917 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402950 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmdw\" (UniqueName: \"kubernetes.io/projected/00dffd10-d567-431f-8dd9-390443f26d96-kube-api-access-8bmdw\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.402973 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403005 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf2fs\" (UniqueName: \"kubernetes.io/projected/dc9230ca-3a51-4ee3-976c-38c27605db87-kube-api-access-lf2fs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403031 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf866f04-6739-40da-8c1c-36d192472220-serving-cert\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403056 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-config-volume\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403077 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0eea7869-af21-4009-856f-65219d64ceea-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403101 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxs7g\" (UniqueName: \"kubernetes.io/projected/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-kube-api-access-hxs7g\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403134 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-tmpfs\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403193 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-registration-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403228 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktdtc\" (UniqueName: \"kubernetes.io/projected/fe48c9c2-8783-475b-a961-d5a4110cb452-kube-api-access-ktdtc\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403251 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-apiservice-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403276 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fe48c9c2-8783-475b-a961-d5a4110cb452-tmpfs\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403301 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403325 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf866f04-6739-40da-8c1c-36d192472220-config\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403352 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-csi-data-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403375 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-srv-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403406 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403452 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-config\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403482 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-node-bootstrap-token\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403506 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56318568-cab8-4d5b-9a20-4531fc8aad60-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403529 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psgc4\" (UniqueName: \"kubernetes.io/projected/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-kube-api-access-psgc4\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403556 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2512f5ef-a611-4637-b41f-41185def421b-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403591 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56318568-cab8-4d5b-9a20-4531fc8aad60-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403615 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prql2\" (UniqueName: \"kubernetes.io/projected/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-kube-api-access-prql2\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403658 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56318568-cab8-4d5b-9a20-4531fc8aad60-config\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403682 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-srv-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403704 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403736 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403772 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-profile-collector-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403794 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvwk\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-kube-api-access-ttvwk\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403818 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbsf\" (UniqueName: \"kubernetes.io/projected/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-kube-api-access-fkbsf\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403841 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lck5k\" (UniqueName: \"kubernetes.io/projected/a1258288-8146-4cba-9d66-2a88e35a1fe9-kube-api-access-lck5k\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403865 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65b5ebf3-054c-4827-96b2-7ea0a26f20af-tmp-dir\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403892 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403918 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6g2r\" (UniqueName: \"kubernetes.io/projected/73928a0d-7a97-4c03-a5e8-6ab37119261c-kube-api-access-p6g2r\") pod \"migrator-866fcbc849-m5rc6\" (UID: \"73928a0d-7a97-4c03-a5e8-6ab37119261c\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403942 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-serving-cert\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403977 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xgch\" (UniqueName: \"kubernetes.io/projected/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-kube-api-access-5xgch\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.403999 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7b12ebb-b568-4d15-abde-14db5041d5d2-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404028 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404058 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404088 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-serving-cert\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404123 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404151 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-cabundle\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404623 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-webhook-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.404694 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:39.904677277 +0000 UTC m=+110.926978640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e67bda-2839-4364-9a75-54864090dc1f-config\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404814 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-images\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404959 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404980 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName:
\"kubernetes.io/empty-dir/a1258288-8146-4cba-9d66-2a88e35a1fe9-tmpfs\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404984 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-stats-auth\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405058 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf866f04-6739-40da-8c1c-36d192472220-tmp-dir\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405077 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2512f5ef-a611-4637-b41f-41185def421b-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405096 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-client\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405134 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56318568-cab8-4d5b-9a20-4531fc8aad60-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405188 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-plugins-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405206 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eea7869-af21-4009-856f-65219d64ceea-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405225 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405243 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-certs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc 
kubenswrapper[5116]: I0322 00:10:39.405266 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2512f5ef-a611-4637-b41f-41185def421b-config\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405286 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-socket-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405311 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4fhd\" (UniqueName: \"kubernetes.io/projected/1bb9e03f-ef85-4dbe-802f-d529e97b092c-kube-api-access-h4fhd\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405336 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405363 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf866f04-6739-40da-8c1c-36d192472220-kube-api-access\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: 
\"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405410 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sg9xh\" (UniqueName: \"kubernetes.io/projected/65b5ebf3-054c-4827-96b2-7ea0a26f20af-kube-api-access-sg9xh\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405427 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-config\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405476 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-package-server-manager-serving-cert\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405502 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-tmp-dir\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405527 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e67bda-2839-4364-9a75-54864090dc1f-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405585 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcvxk\" (UniqueName: \"kubernetes.io/projected/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-kube-api-access-mcvxk\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405836 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " 
pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405863 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd443947-7241-49e3-9d98-f55329818dcc-webhook-certs\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405882 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2512f5ef-a611-4637-b41f-41185def421b-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405898 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-key\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405921 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65b5ebf3-054c-4827-96b2-7ea0a26f20af-metrics-tls\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.405972 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rjsf\" (UniqueName: 
\"kubernetes.io/projected/93e67bda-2839-4364-9a75-54864090dc1f-kube-api-access-4rjsf\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.406597 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/56318568-cab8-4d5b-9a20-4531fc8aad60-tmp\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.407400 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0eea7869-af21-4009-856f-65219d64ceea-mcc-auth-proxy-config\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.407586 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.408035 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-tmpfs\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc 
kubenswrapper[5116]: I0322 00:10:39.408139 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-config\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.408461 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-registration-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.404626 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7b12ebb-b568-4d15-abde-14db5041d5d2-trusted-ca\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.409000 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-service-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.409082 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-mountpoint-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.409975 
5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq7tp\" (UniqueName: \"kubernetes.io/projected/bd9a9e58-4363-4eaf-a27c-21aacaeea0a4-kube-api-access-tq7tp\") pod \"machine-approver-54c688565-5drp9\" (UID: \"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4\") " pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410580 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-tmp-dir\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410810 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7b12ebb-b568-4d15-abde-14db5041d5d2-metrics-tls\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410953 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93e67bda-2839-4364-9a75-54864090dc1f-config\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.410999 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf866f04-6739-40da-8c1c-36d192472220-serving-cert\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 
00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.411156 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-images\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.411805 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-cabundle\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.411951 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-auth-proxy-config\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412016 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-ca\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412347 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-socket-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 
22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412394 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65b5ebf3-054c-4827-96b2-7ea0a26f20af-tmp-dir\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.412949 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413025 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/fe48c9c2-8783-475b-a961-d5a4110cb452-tmpfs\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413255 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf866f04-6739-40da-8c1c-36d192472220-config\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413418 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2512f5ef-a611-4637-b41f-41185def421b-tmp-dir\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" 
Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414014 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-plugins-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-profile-collector-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.413572 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1bb9e03f-ef85-4dbe-802f-d529e97b092c-csi-data-dir\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414410 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56318568-cab8-4d5b-9a20-4531fc8aad60-config\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414432 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf866f04-6739-40da-8c1c-36d192472220-tmp-dir\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414453 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2512f5ef-a611-4637-b41f-41185def421b-serving-cert\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414667 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00dffd10-d567-431f-8dd9-390443f26d96-service-ca-bundle\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414714 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-tmp-dir\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.414957 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2512f5ef-a611-4637-b41f-41185def421b-config\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.415870 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-default-certificate\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.416436 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-stats-auth\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.417234 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.418001 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-etcd-client\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.418262 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93e67bda-2839-4364-9a75-54864090dc1f-serving-cert\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.419830 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-c7xbb\" (UniqueName: \"kubernetes.io/projected/9884d9ba-fbeb-40db-8105-de302262478b-kube-api-access-c7xbb\") pod \"machine-api-operator-755bb95488-w2nq2\" (UID: \"9884d9ba-fbeb-40db-8105-de302262478b\") " pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.419920 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56318568-cab8-4d5b-9a20-4531fc8aad60-serving-cert\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420236 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420710 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-serving-cert\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420814 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-srv-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.420837 
5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0eea7869-af21-4009-856f-65219d64ceea-proxy-tls\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.421361 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-apiservice-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.421551 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00dffd10-d567-431f-8dd9-390443f26d96-metrics-certs\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422109 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a1258288-8146-4cba-9d66-2a88e35a1fe9-profile-collector-cert\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422213 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-proxy-tls\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " 
pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422304 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fe48c9c2-8783-475b-a961-d5a4110cb452-srv-cert\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422513 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bd443947-7241-49e3-9d98-f55329818dcc-webhook-certs\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.422964 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65b5ebf3-054c-4827-96b2-7ea0a26f20af-metrics-tls\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.423013 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-webhook-cert\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.423638 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-package-server-manager-serving-cert\") pod 
\"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.428596 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.433875 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-signing-key\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.472187 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.481368 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9648k\" (UniqueName: \"kubernetes.io/projected/bac29c51-c815-4827-bd3d-c74f2e31f842-kube-api-access-9648k\") pod \"authentication-operator-7f5c659b84-4vmb4\" (UID: \"bac29c51-c815-4827-bd3d-c74f2e31f842\") " pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.484316 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pngr\" (UniqueName: \"kubernetes.io/projected/68cdf6e7-fccd-4375-9688-7a2bcbefd82f-kube-api-access-4pngr\") pod \"apiserver-8596bd845d-f59q2\" (UID: \"68cdf6e7-fccd-4375-9688-7a2bcbefd82f\") " pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.484745 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.485419 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"image-pruner-29568960-tjk88\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.488271 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.496068 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda442ae21_7eff_4990_998f_27afcb839a6c.slice/crio-41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34 WatchSource:0}: Error finding container 41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34: Status 404 returned error can't find the container with id 41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34 Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.496836 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"route-controller-manager-776cdc94d6-fw5k5\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.499416 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.507631 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.507979 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.007966374 +0000 UTC m=+111.030267747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.512634 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.519218 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.519331 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.539956 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.554712 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.561786 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.563749 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-config\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.566824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-serving-cert\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.579933 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.585531 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.600620 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.606515 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.608304 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.608900 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.108884717 +0000 UTC m=+111.131186090 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.621338 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.622769 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.640476 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.669227 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.673073 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-9ddfb9f55-m5dds"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.682705 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.683204 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.688665 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.699963 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.705617 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.710598 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.710962 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.210947576 +0000 UTC m=+111.233248949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.722400 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.735275 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-755bb95488-w2nq2"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.744884 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\"" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.756213 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9884d9ba_fbeb_40db_8105_de302262478b.slice/crio-0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8 WatchSource:0}: Error finding container 0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8: Status 404 returned error can't find the container with id 0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8 Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.758413 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.759143 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.772878 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-node-bootstrap-token\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.782976 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.787909 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-747b44746d-cb5p2"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.796737 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dc9230ca-3a51-4ee3-976c-38c27605db87-certs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.803575 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.811930 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod 
\"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.812876 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.312855849 +0000 UTC m=+111.335157222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.827415 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.838072 5116 request.go:752] "Waited before sending request" delay="1.92430841s" reason="client-side throttling, not priority and fairness" verb="GET" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.839205 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-cert\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.841761 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.875652 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.879457 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.899560 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.902854 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.903645 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.910670 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac29c51_c815_4827_bd3d_c74f2e31f842.slice/crio-d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b WatchSource:0}: Error finding container d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b: Status 404 returned error can't find the container with id d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.914384 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:39 crc kubenswrapper[5116]: E0322 00:10:39.914679 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.41466891 +0000 UTC m=+111.436970283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.918183 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.923614 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.938989 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.947053 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29568960-tjk88"] Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.956535 5116 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f51f3b4_6887_42b5_ad77_5a2f349a162a.slice/crio-b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56 WatchSource:0}: Error finding container b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56: Status 404 returned error can't find the container with id b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56 Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.958922 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Mar 22 00:10:39 crc kubenswrapper[5116]: W0322 00:10:39.968897 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b3ade2_2521_43a2_a5fc_2c33d19f3a58.slice/crio-82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c WatchSource:0}: Error finding container 82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c: Status 404 returned error can't find the container with id 82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.979208 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-sysctl-allowlist\"" Mar 22 00:10:39 crc kubenswrapper[5116]: I0322 00:10:39.986042 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.000532 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 22 00:10:40 
crc kubenswrapper[5116]: I0322 00:10:40.007210 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-config-volume\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.015924 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.016085 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.516066928 +0000 UTC m=+111.538368301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.016306 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.016718 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.516710347 +0000 UTC m=+111.539011720 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.019718 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\""
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.021068 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-8596bd845d-f59q2"]
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.039870 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.048218 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-metrics-tls\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.062924 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.079710 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.099253 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.117487 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.117876 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.617853887 +0000 UTC m=+111.640155260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.120008 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.174254 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerStarted","Data":"b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.176002 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerStarted","Data":"82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.178039 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.178120 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-cb5p2" event={"ID":"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0","Type":"ContainerStarted","Data":"8d50655ab10e679cee022ffa16220973ee55f67aaf4152a787d4b6f4d6b3136d"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.178158 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-747b44746d-cb5p2" event={"ID":"45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0","Type":"ContainerStarted","Data":"fdd8b565275a73869aba0dace482f6ad395d7c9d13ffd827c4dfd53f39767e96"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.179699 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerStarted","Data":"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.179732 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerStarted","Data":"4a8a88fe9fa050abb0479c637d1e4e232ab389aa2a939c4d9f3135fe99408731"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.182566 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" event={"ID":"a442ae21-7eff-4990-998f-27afcb839a6c","Type":"ContainerStarted","Data":"7556a74f62cfa03d7d6877cf0ac7b3fcefce33d8a21efd30fa182f4e53aac22e"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.182620 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" event={"ID":"a442ae21-7eff-4990-998f-27afcb839a6c","Type":"ContainerStarted","Data":"41b30eca4ad92a4f782208036b6ab2600632c2f03500e7afa53c584f6c389d34"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.187880 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" event={"ID":"68cdf6e7-fccd-4375-9688-7a2bcbefd82f","Type":"ContainerStarted","Data":"c4605d831a0d6b573d011919b36267673e4cb38084bde38f6c891ba555f51a65"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.191895 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" event={"ID":"9884d9ba-fbeb-40db-8105-de302262478b","Type":"ContainerStarted","Data":"187e6738540da05e067ea423b98d4efd3d217d04e0f2ac6b5c6cc37914224616"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.191934 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" event={"ID":"9884d9ba-fbeb-40db-8105-de302262478b","Type":"ContainerStarted","Data":"0d2a81b78273f088b81197ca78ec22abc78cc5fbdc1b3349477a5158392205a8"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.194244 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" event={"ID":"94c19a90-c2c9-4236-98be-a0516dbb840b","Type":"ContainerStarted","Data":"5da9c7d414944f4328a1119aa21269f4a0ac5f86587a539468ca57ad8f87b24f"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.194268 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" event={"ID":"94c19a90-c2c9-4236-98be-a0516dbb840b","Type":"ContainerStarted","Data":"95b886b5476f7b77861e9a6b7b021397c333f535a09578ba08ef153f2832cd25"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.195430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" event={"ID":"bac29c51-c815-4827-bd3d-c74f2e31f842","Type":"ContainerStarted","Data":"d4ef373b94d181afedea5dabd5fee414987f702b031e6137da1403a3e31e3f9b"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.196912 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"oauth-openshift-66458b6674-8qfhd\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.197906 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" event={"ID":"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4","Type":"ContainerStarted","Data":"42db71085552419312922c644d5987206fd0d0ef12c21f3a93fedd2cdd98df1d"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.198003 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" event={"ID":"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4","Type":"ContainerStarted","Data":"287c0d5e6ba5b835ad3f5bd309bb9680d421b30fd9be8e8a568092d6a603afd7"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.201325 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerStarted","Data":"701d96edd356dab1e9a8ae15bd01aef0cb8c6d8c29074e7d24c96c16a861b673"}
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.219288 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.219963 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.719941707 +0000 UTC m=+111.742243150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.219973 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkpc6\" (UniqueName: \"kubernetes.io/projected/44b1188c-0fa6-48c7-bf76-6e65ca8174ec-kube-api-access-xkpc6\") pod \"openshift-config-operator-5777786469-wb6r8\" (UID: \"44b1188c-0fa6-48c7-bf76-6e65ca8174ec\") " pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.239945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsw9z\" (UniqueName: \"kubernetes.io/projected/4c2755ce-817d-47b0-9f19-7218641d0c5b-kube-api-access-zsw9z\") pod \"console-64d44f6ddf-9g5sg\" (UID: \"4c2755ce-817d-47b0-9f19-7218641d0c5b\") " pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.248448 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.255902 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-bound-sa-token\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.277207 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtlfq\" (UniqueName: \"kubernetes.io/projected/3bf5ae18-6e08-436b-939f-03347eda68a8-kube-api-access-wtlfq\") pod \"cluster-samples-operator-6b564684c8-dfx6t\" (UID: \"3bf5ae18-6e08-436b-939f-03347eda68a8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.301551 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc8c\" (UniqueName: \"kubernetes.io/projected/720e1d69-81b3-4fdb-94c1-dabb0707c833-kube-api-access-zqc8c\") pod \"cluster-image-registry-operator-86c45576b9-qrgfg\" (UID: \"720e1d69-81b3-4fdb-94c1-dabb0707c833\") " pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.301918 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.307634 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.315197 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv46d\" (UniqueName: \"kubernetes.io/projected/413bd8dc-5257-4fd7-95c1-01f6d79278ee-kube-api-access-tv46d\") pod \"openshift-controller-manager-operator-686468bdd5-pdqfv\" (UID: \"413bd8dc-5257-4fd7-95c1-01f6d79278ee\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.321344 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.321737 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.821708005 +0000 UTC m=+111.844009378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.325890 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.326483 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.826469247 +0000 UTC m=+111.848770630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.335330 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95r42\" (UniqueName: \"kubernetes.io/projected/b4117709-89bd-4e72-8016-0c25c0ece2c6-kube-api-access-95r42\") pod \"console-operator-67c89758df-vnd4f\" (UID: \"b4117709-89bd-4e72-8016-0c25c0ece2c6\") " pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.369130 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.377861 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktdtc\" (UniqueName: \"kubernetes.io/projected/fe48c9c2-8783-475b-a961-d5a4110cb452-kube-api-access-ktdtc\") pod \"olm-operator-5cdf44d969-cp4p2\" (UID: \"fe48c9c2-8783-475b-a961-d5a4110cb452\") " pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.397364 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rjsf\" (UniqueName: \"kubernetes.io/projected/93e67bda-2839-4364-9a75-54864090dc1f-kube-api-access-4rjsf\") pod \"kube-storage-version-migrator-operator-565b79b866-tlh4v\" (UID: \"93e67bda-2839-4364-9a75-54864090dc1f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.409431 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.422740 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"cni-sysctl-allowlist-ds-57nbs\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.427307 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.427439 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.92742261 +0000 UTC m=+111.949723983 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.427566 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.427907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:40.927900356 +0000 UTC m=+111.950201729 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.440102 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xgch\" (UniqueName: \"kubernetes.io/projected/5df5e2ea-4661-49c7-95f2-d8c039bbea5d-kube-api-access-5xgch\") pod \"service-ca-operator-5b9c976747-zvprn\" (UID: \"5df5e2ea-4661-49c7-95f2-d8c039bbea5d\") " pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.460649 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.465105 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxs7g\" (UniqueName: \"kubernetes.io/projected/bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4-kube-api-access-hxs7g\") pod \"etcd-operator-69b85846b6-l4jwl\" (UID: \"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4\") " pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.477435 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.499255 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgc4\" (UniqueName: \"kubernetes.io/projected/50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee-kube-api-access-psgc4\") pod \"machine-config-operator-67c9d58cbb-l9f6f\" (UID: \"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee\") " pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.500837 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-5777786469-wb6r8"]
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.502043 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/56318568-cab8-4d5b-9a20-4531fc8aad60-kube-api-access\") pod \"openshift-kube-scheduler-operator-54f497555d-lmct6\" (UID: \"56318568-cab8-4d5b-9a20-4531fc8aad60\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.508568 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.526620 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-bound-sa-token\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.528763 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.529187 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.529427 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.029410826 +0000 UTC m=+112.051712199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.536794 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.539116 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xp2b\" (UniqueName: \"kubernetes.io/projected/dcd47220-17b3-4593-9a12-fb67c0c0dcc8-kube-api-access-6xp2b\") pod \"ingress-canary-55dml\" (UID: \"dcd47220-17b3-4593-9a12-fb67c0c0dcc8\") " pod="openshift-ingress-canary/ingress-canary-55dml"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.552629 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg"]
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.555601 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqx7\" (UniqueName: \"kubernetes.io/projected/a224637e-e693-4ae7-89c3-1a01e6c9a6f5-kube-api-access-9vqx7\") pod \"packageserver-7d4fc7d867-td5gr\" (UID: \"a224637e-e693-4ae7-89c3-1a01e6c9a6f5\") " pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.560411 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.560470 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.573652 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzqjt\" (UniqueName: \"kubernetes.io/projected/fc0602bb-5acd-426a-b3d6-3a2effb49bf3-kube-api-access-gzqjt\") pod \"package-server-manager-77f986bd66-ms24k\" (UID: \"fc0602bb-5acd-426a-b3d6-3a2effb49bf3\") " pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.592757 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"]
Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.598199 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b1188c_0fa6_48c7_bf76_6e65ca8174ec.slice/crio-d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb WatchSource:0}: Error finding container d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb: Status 404 returned error can't find the container with id d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.598615 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lck5k\" (UniqueName: \"kubernetes.io/projected/a1258288-8146-4cba-9d66-2a88e35a1fe9-kube-api-access-lck5k\") pod \"catalog-operator-75ff9f647d-47j6l\" (UID: \"a1258288-8146-4cba-9d66-2a88e35a1fe9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.600302 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720e1d69_81b3_4fdb_94c1_dabb0707c833.slice/crio-21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31 WatchSource:0}: Error finding container 21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31: Status 404 returned error can't find the container with id 21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31
Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.604385 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ebea9b_fc7b_4d54_af53_f6f61e0fce97.slice/crio-541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb WatchSource:0}: Error finding container 541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb: Status 404 returned error can't find the container with id 541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.615803 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvwk\" (UniqueName: \"kubernetes.io/projected/c7b12ebb-b568-4d15-abde-14db5041d5d2-kube-api-access-ttvwk\") pod \"ingress-operator-6b9cb4dbcf-qdtdp\" (UID: \"c7b12ebb-b568-4d15-abde-14db5041d5d2\") " pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.630965 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.631350 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.131334251 +0000 UTC m=+112.153635624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.633795 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbsf\" (UniqueName: \"kubernetes.io/projected/bd4c0dd5-6c39-4661-b0c6-424d6b061f04-kube-api-access-fkbsf\") pod \"service-ca-74545575db-z8df4\" (UID: \"bd4c0dd5-6c39-4661-b0c6-424d6b061f04\") " pod="openshift-service-ca/service-ca-74545575db-z8df4"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.639795 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.654460 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"collect-profiles-29568960-fmcxl\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.662941 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.675798 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmdw\" (UniqueName: \"kubernetes.io/projected/00dffd10-d567-431f-8dd9-390443f26d96-kube-api-access-8bmdw\") pod \"router-default-68cf44c8b8-2jlxw\" (UID: \"00dffd10-d567-431f-8dd9-390443f26d96\") " pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.677355 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"
Mar 22 00:10:40 crc kubenswrapper[5116]: W0322 00:10:40.693440 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d2a94f_b4d4_4cdc_b862_a4866cadaea1.slice/crio-03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d WatchSource:0}: Error finding container 03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d: Status 404 returned error can't find the container with id 03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.716895 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6g2r\" (UniqueName: \"kubernetes.io/projected/73928a0d-7a97-4c03-a5e8-6ab37119261c-kube-api-access-p6g2r\") pod \"migrator-866fcbc849-m5rc6\" (UID: \"73928a0d-7a97-4c03-a5e8-6ab37119261c\") " pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.722194 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prql2\" (UniqueName: \"kubernetes.io/projected/0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a-kube-api-access-prql2\") pod \"dns-default-rkswl\" (UID: \"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a\") " pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.732744 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.733417 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.233395199 +0000 UTC m=+112.255696572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.734715 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.747315 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.747403 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg9xh\" (UniqueName: \"kubernetes.io/projected/65b5ebf3-054c-4827-96b2-7ea0a26f20af-kube-api-access-sg9xh\") pod \"dns-operator-799b87ffcd-42tp2\" (UID: \"65b5ebf3-054c-4827-96b2-7ea0a26f20af\") " pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.755253 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.763916 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.772078 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"marketplace-operator-547dbd544d-lf2zm\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.781544 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcvxk\" (UniqueName: \"kubernetes.io/projected/0910a8a1-0226-42c8-ab1d-b142d2b7a00d-kube-api-access-mcvxk\") pod \"control-plane-machine-set-operator-75ffdb6fcd-m65lb\" (UID: \"0910a8a1-0226-42c8-ab1d-b142d2b7a00d\") " pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"
Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.781797 5116 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.791191 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-74545575db-z8df4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.800437 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.802239 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf2fs\" (UniqueName: \"kubernetes.io/projected/dc9230ca-3a51-4ee3-976c-38c27605db87-kube-api-access-lf2fs\") pod \"machine-config-server-r75lk\" (UID: \"dc9230ca-3a51-4ee3-976c-38c27605db87\") " pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.820461 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-r75lk" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.820544 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.828647 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-55dml" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.834272 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.834696 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.334677753 +0000 UTC m=+112.356979126 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.835095 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.835387 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf866f04-6739-40da-8c1c-36d192472220-kube-api-access\") pod \"kube-apiserver-operator-575994946d-jxbnl\" (UID: \"cf866f04-6739-40da-8c1c-36d192472220\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.842929 5116 request.go:752] "Waited before sending request" delay="1.429325464s" reason="client-side throttling, not priority and fairness" verb="POST" URL="https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/serviceaccounts/multus-ac/token" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.845663 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4fhd\" (UniqueName: \"kubernetes.io/projected/1bb9e03f-ef85-4dbe-802f-d529e97b092c-kube-api-access-h4fhd\") pod \"csi-hostpathplugin-mzck5\" (UID: \"1bb9e03f-ef85-4dbe-802f-d529e97b092c\") " pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.852483 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.859695 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d44f6ddf-9g5sg"] Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.867601 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52j77\" (UniqueName: \"kubernetes.io/projected/bd443947-7241-49e3-9d98-f55329818dcc-kube-api-access-52j77\") pod \"multus-admission-controller-69db94689b-bkst6\" (UID: \"bd443947-7241-49e3-9d98-f55329818dcc\") " pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.872514 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rkswl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.878510 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcwsm\" (UniqueName: \"kubernetes.io/projected/0eea7869-af21-4009-856f-65219d64ceea-kube-api-access-rcwsm\") pod \"machine-config-controller-f9cdd68f7-wfffg\" (UID: \"0eea7869-af21-4009-856f-65219d64ceea\") " pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.897754 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2512f5ef-a611-4637-b41f-41185def421b-kube-api-access\") pod \"kube-controller-manager-operator-69d5f845f8-sk5s4\" (UID: \"2512f5ef-a611-4637-b41f-41185def421b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.925074 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.931386 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.935541 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:40 crc kubenswrapper[5116]: E0322 00:10:40.935850 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.435834073 +0000 UTC m=+112.458135446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.945693 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.955018 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" Mar 22 00:10:40 crc kubenswrapper[5116]: I0322 00:10:40.993342 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.023713 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.036605 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.036897 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.53688617 +0000 UTC m=+112.559187533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: W0322 00:10:41.113606 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00dffd10_d567_431f_8dd9_390443f26d96.slice/crio-5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f WatchSource:0}: Error finding container 5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f: Status 404 returned error can't find the container with id 5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.127007 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv"] Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.137813 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.137884 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:41.637866644 +0000 UTC m=+112.660168027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.138196 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.138546 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.638528965 +0000 UTC m=+112.660830328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.207903 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerStarted","Data":"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.208200 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.208904 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-r75lk" event={"ID":"dc9230ca-3a51-4ee3-976c-38c27605db87","Type":"ContainerStarted","Data":"efffb0ad943198a07230309dd53c994c1411da589dbe123311cafbb9e5fa0172"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.210430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" event={"ID":"9884d9ba-fbeb-40db-8105-de302262478b","Type":"ContainerStarted","Data":"9bfbd0761ac53ac1129a6e95695fa17cb6457e56c7d321788eea8e02c8759ed4"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.213708 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wlq8c" 
event={"ID":"94c19a90-c2c9-4236-98be-a0516dbb840b","Type":"ContainerStarted","Data":"94524cbda17a754b2c962f246991ddd1a68e24324f4f27eb2c33b803e8cab1ef"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.218227 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" event={"ID":"44b1188c-0fa6-48c7-bf76-6e65ca8174ec","Type":"ContainerStarted","Data":"d1e144ef7003c7250e36bbf914ac057a66e10f25b9b8f14f72333604bf394adb"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.219658 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" event={"ID":"bac29c51-c815-4827-bd3d-c74f2e31f842","Type":"ContainerStarted","Data":"28d79fc4790c4f02ffe53e980b29154ae8315112a1d282830854ede66cb399e2"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.221818 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerStarted","Data":"03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.230425 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" event={"ID":"720e1d69-81b3-4fdb-94c1-dabb0707c833","Type":"ContainerStarted","Data":"21093a88dfe98cf8a3593ef423b3139b9e015df32f65c9a38c0fac090f1bfe31"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.233325 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerStarted","Data":"541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.235229 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerStarted","Data":"1d4ffcdcf7f3c1ceaea89d564eaa99a17f05170fe4b5407ce1655d672a0e5a4e"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.237260 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" event={"ID":"00dffd10-d567-431f-8dd9-390443f26d96","Type":"ContainerStarted","Data":"5c5f21dd0fb1f5cca22c0f6633d75243cf104676fd61cc14c483a6ebe77be37f"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.239027 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.239403 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.739361275 +0000 UTC m=+112.761662658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.240942 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.240947 5116 generic.go:358] "Generic (PLEG): container finished" podID="68cdf6e7-fccd-4375-9688-7a2bcbefd82f" containerID="38b12cab203858c4c797989bc1b13b2ea864137e8626e042e66c979fb3b26066" exitCode=0 Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.240976 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" event={"ID":"68cdf6e7-fccd-4375-9688-7a2bcbefd82f","Type":"ContainerDied","Data":"38b12cab203858c4c797989bc1b13b2ea864137e8626e042e66c979fb3b26066"} Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.241440 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.741363198 +0000 UTC m=+112.763664651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.244282 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d44f6ddf-9g5sg" event={"ID":"4c2755ce-817d-47b0-9f19-7218641d0c5b","Type":"ContainerStarted","Data":"dbec98cf4e43e15ff41d5a49916dbfd9285ace92f1c8184b77f7389222a14b01"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.247054 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" event={"ID":"bd9a9e58-4363-4eaf-a27c-21aacaeea0a4","Type":"ContainerStarted","Data":"89bebe35c6dbb2c0b52a587df591e5f99263e1302c9ca5663a1b8462c04e9fb6"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.250003 5116 generic.go:358] "Generic (PLEG): container finished" podID="8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0" containerID="aebe47f61ff1b229e280099427e2808f24b0724cc7faa10d582dbd873dfaedd3" exitCode=0 Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.250104 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerDied","Data":"aebe47f61ff1b229e280099427e2808f24b0724cc7faa10d582dbd873dfaedd3"} Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.278724 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.278924 5116 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-747b44746d-cb5p2" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.284881 5116 patch_prober.go:28] interesting pod/route-controller-manager-776cdc94d6-fw5k5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.284961 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.285404 5116 patch_prober.go:28] interesting pod/controller-manager-65b6cccf98-9kdkj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.285470 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.299670 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 
00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.299745 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.344301 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.346357 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.846336289 +0000 UTC m=+112.868637662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.410202 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-7f5c659b84-4vmb4" podStartSLOduration=91.410124514 podStartE2EDuration="1m31.410124514s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:41.351098181 +0000 UTC m=+112.373399554" watchObservedRunningTime="2026-03-22 00:10:41.410124514 +0000 UTC m=+112.432425897" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.446389 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.446794 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:41.946777027 +0000 UTC m=+112.969078400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.469044 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-755bb95488-w2nq2" podStartSLOduration=91.469030673 podStartE2EDuration="1m31.469030673s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:41.437936146 +0000 UTC m=+112.460237539" watchObservedRunningTime="2026-03-22 00:10:41.469030673 +0000 UTC m=+112.491332046" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.547415 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.548313 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.048291287 +0000 UTC m=+113.070592660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.653352 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.653683 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.153669812 +0000 UTC m=+113.175971185 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.719138 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wlq8c" podStartSLOduration=91.719118928 podStartE2EDuration="1m31.719118928s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:41.715260986 +0000 UTC m=+112.737562379" watchObservedRunningTime="2026-03-22 00:10:41.719118928 +0000 UTC m=+112.741420321" Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.754730 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.754919 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.254882453 +0000 UTC m=+113.277183826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.755287 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.755662 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.255653337 +0000 UTC m=+113.277954710 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.857283 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.857875 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.357855731 +0000 UTC m=+113.380157114 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:41 crc kubenswrapper[5116]: I0322 00:10:41.960323 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:41 crc kubenswrapper[5116]: E0322 00:10:41.961016 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.460998833 +0000 UTC m=+113.483300206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.061697 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.062194 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.562176924 +0000 UTC m=+113.584478297 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.107607 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.118769 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.138119 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"] Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.147390 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe48c9c2_8783_475b_a961_d5a4110cb452.slice/crio-f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486 WatchSource:0}: Error finding container f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486: Status 404 returned error can't find the container with id f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.163962 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") 
" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.164710 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.664696327 +0000 UTC m=+113.686997700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.264947 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.265376 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.765353721 +0000 UTC m=+113.787655094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.275733 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-r75lk" event={"ID":"dc9230ca-3a51-4ee3-976c-38c27605db87","Type":"ContainerStarted","Data":"bd24c60ce4d85de35c70b0f8332cafa2c6cb13a90f7c511a6a9293a96a00baf1"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.282795 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-846cbfc458-vdh4n" podStartSLOduration=93.282770213 podStartE2EDuration="1m33.282770213s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.274724078 +0000 UTC m=+113.297025461" watchObservedRunningTime="2026-03-22 00:10:42.282770213 +0000 UTC m=+113.305071596" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.311997 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" event={"ID":"720e1d69-81b3-4fdb-94c1-dabb0707c833","Type":"ContainerStarted","Data":"73a12da3b3bc7773cfec2bcaf6ea449dce19d7297c195ed650b9612c81288a8e"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.322538 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 
00:10:42.332081 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" event={"ID":"fe48c9c2-8783-475b-a961-d5a4110cb452","Type":"ContainerStarted","Data":"f64f0c7f6380e0bbf5ee6de764ef3f133abf565b5e25e6d3558e99c1a0b86486"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.336485 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.354882 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.363385 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.368025 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.368943 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.868930437 +0000 UTC m=+113.891231810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.378500 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" event={"ID":"413bd8dc-5257-4fd7-95c1-01f6d79278ee","Type":"ContainerStarted","Data":"723176f625a94af34283d0f388978d0c12b4c1bf21948c21dc6d8168fb69989f"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.379058 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rkswl"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.390380 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" event={"ID":"00dffd10-d567-431f-8dd9-390443f26d96","Type":"ContainerStarted","Data":"1760d0cb129bddc97f7c1c542eab1158bf68b95b799269e9acc9c36272c6d24e"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.405907 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" event={"ID":"93e67bda-2839-4364-9a75-54864090dc1f","Type":"ContainerStarted","Data":"c748392e8bdddae09e73c1aec7abac3ee2017ebac04d6f15c62d3c0ad4fc8ea2"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.429012 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-74545575db-z8df4"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.437073 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-64d44f6ddf-9g5sg" event={"ID":"4c2755ce-817d-47b0-9f19-7218641d0c5b","Type":"ContainerStarted","Data":"d31637ad8e7ff373cab181373fc2ece79b732258d319b32c6a963d0b07a6c56a"} Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.450580 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-67c89758df-vnd4f"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.451230 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-55dml"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.466710 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.469488 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.470834 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:42.970816461 +0000 UTC m=+113.993117834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.489151 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4c0dd5_6c39_4661_b0c6_424d6b061f04.slice/crio-a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991 WatchSource:0}: Error finding container a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991: Status 404 returned error can't find the container with id a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.494068 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55450: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.499001 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.499143 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.511694 5116 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4117709_89bd_4e72_8016_0c25c0ece2c6.slice/crio-ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0 WatchSource:0}: Error finding container ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0: Status 404 returned error can't find the container with id ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.575878 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.576250 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.076233155 +0000 UTC m=+114.098534528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.610985 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55466: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.654905 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.658496 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" podStartSLOduration=92.658478925 podStartE2EDuration="1m32.658478925s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.657868776 +0000 UTC m=+113.680170149" watchObservedRunningTime="2026-03-22 00:10:42.658478925 +0000 UTC m=+113.680780298" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.673036 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.679727 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: 
\"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.680373 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.180353559 +0000 UTC m=+114.202654932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.689917 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.705641 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55468: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.723979 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-799b87ffcd-42tp2"] Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.728459 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1258288_8146_4cba_9d66_2a88e35a1fe9.slice/crio-9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755 WatchSource:0}: Error finding container 9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755: Status 404 returned error can't find the container with id 9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755 Mar 22 00:10:42 crc 
kubenswrapper[5116]: I0322 00:10:42.751938 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.777792 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.782388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.782743 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.282720657 +0000 UTC m=+114.305022030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.803457 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.805867 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mzck5"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.810366 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-69db94689b-bkst6"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.811547 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55474: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.812730 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.819067 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.845585 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl"] Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.866971 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl"] Mar 22 
00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.870767 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.874773 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d44f6ddf-9g5sg" podStartSLOduration=92.874752378 podStartE2EDuration="1m32.874752378s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.869235473 +0000 UTC m=+113.891536866" watchObservedRunningTime="2026-03-22 00:10:42.874752378 +0000 UTC m=+113.897053751" Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.880991 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2512f5ef_a611_4637_b41f_41185def421b.slice/crio-c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12 WatchSource:0}: Error finding container c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12: Status 404 returned error can't find the container with id c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.886081 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.886283 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:43.386235832 +0000 UTC m=+114.408537195 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.886886 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.887509 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.387491862 +0000 UTC m=+114.409793235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: W0322 00:10:42.888726 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice/crio-dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9 WatchSource:0}: Error finding container dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9: Status 404 returned error can't find the container with id dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9 Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.910253 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55490: no serving certificate available for the kubelet" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.942350 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.942401 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.942982 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.988554 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:42 crc kubenswrapper[5116]: E0322 00:10:42.989380 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.489357395 +0000 UTC m=+114.511658768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:42 crc kubenswrapper[5116]: I0322 00:10:42.995085 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-747b44746d-cb5p2" podStartSLOduration=92.995067216 podStartE2EDuration="1m32.995067216s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:42.991514233 +0000 UTC m=+114.013815626" watchObservedRunningTime="2026-03-22 00:10:42.995067216 +0000 UTC m=+114.017368599" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.043966 5116 ???:1] "http: TLS handshake error 
from 192.168.126.11:55496: no serving certificate available for the kubelet" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.090725 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.091368 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.591356481 +0000 UTC m=+114.613657854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.131944 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-86c45576b9-qrgfg" podStartSLOduration=93.131921209 podStartE2EDuration="1m33.131921209s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.128786429 +0000 UTC m=+114.151087802" watchObservedRunningTime="2026-03-22 00:10:43.131921209 +0000 UTC m=+114.154222592" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 
00:10:43.132317 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55502: no serving certificate available for the kubelet" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.136696 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29568960-tjk88" podStartSLOduration=94.136681279 podStartE2EDuration="1m34.136681279s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.056092582 +0000 UTC m=+114.078393955" watchObservedRunningTime="2026-03-22 00:10:43.136681279 +0000 UTC m=+114.158982652" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.149882 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" podStartSLOduration=93.149862857 podStartE2EDuration="1m33.149862857s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.147997358 +0000 UTC m=+114.170298741" watchObservedRunningTime="2026-03-22 00:10:43.149862857 +0000 UTC m=+114.172164230" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.166132 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-54c688565-5drp9" podStartSLOduration=95.166109353 podStartE2EDuration="1m35.166109353s" podCreationTimestamp="2026-03-22 00:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.163684366 +0000 UTC m=+114.185985739" watchObservedRunningTime="2026-03-22 00:10:43.166109353 +0000 UTC m=+114.188410726" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.181810 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.194917 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.195330 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.695309719 +0000 UTC m=+114.717611092 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.197483 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podStartSLOduration=93.197465288 podStartE2EDuration="1m33.197465288s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.196476677 +0000 UTC m=+114.218778150" watchObservedRunningTime="2026-03-22 00:10:43.197465288 +0000 UTC m=+114.219766661" Mar 22 00:10:43 crc 
kubenswrapper[5116]: I0322 00:10:43.286544 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55510: no serving certificate available for the kubelet" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.299643 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.300434 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.800416834 +0000 UTC m=+114.822718207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.403474 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.403633 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.903591729 +0000 UTC m=+114.925893102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.404752 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.405210 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:43.90519777 +0000 UTC m=+114.927499143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.469426 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" event={"ID":"68cdf6e7-fccd-4375-9688-7a2bcbefd82f","Type":"ContainerStarted","Data":"aa58c1b4b605febfa81daefda67e613d9aa8f5b439bda030490238302b4fc814"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.476106 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" event={"ID":"a224637e-e693-4ae7-89c3-1a01e6c9a6f5","Type":"ContainerStarted","Data":"a8af25551d1890f12f0d47062adc0afaa25a755cba29680343f498bfb1020e02"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.489378 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerStarted","Data":"0ec1c1ec99b4ee4c79936b1b90ac85731d0a065a26aa6ac29f075ef6bdc7676a"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.491352 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" event={"ID":"cf866f04-6739-40da-8c1c-36d192472220","Type":"ContainerStarted","Data":"cfd4dc4ddc61db053d22358e0a3c01f31a90898bf77c1791f70ef6e6eb889660"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.493925 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" event={"ID":"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee","Type":"ContainerStarted","Data":"a3c9c27e9d2cc97bca372317fcf8d468530023234119b2a4fbe84defe8f015a8"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.496518 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" event={"ID":"0eea7869-af21-4009-856f-65219d64ceea","Type":"ContainerStarted","Data":"27260d6135cfacddcdfca7e2b5f9de8c645628459b662e97af029712da6761eb"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.498006 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" event={"ID":"2512f5ef-a611-4637-b41f-41185def421b","Type":"ContainerStarted","Data":"c850a5b15ebe369168442d8759d8d793df9b63902d501fe1424f01606db18a12"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.501769 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" event={"ID":"73928a0d-7a97-4c03-a5e8-6ab37119261c","Type":"ContainerStarted","Data":"dc3baf11f173de740ebc9e5638baaa55491861bf7fa3f9563de211a3294d2d8b"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.505791 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.505961 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:44.005928276 +0000 UTC m=+115.028229649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.506348 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.506796 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.006757403 +0000 UTC m=+115.029058776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.507567 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" event={"ID":"3bf5ae18-6e08-436b-939f-03347eda68a8","Type":"ContainerStarted","Data":"441a1789203e5ff9b4371c60983659510d2ce4e47c16b032b945bd3ef8532888"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.507629 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" event={"ID":"3bf5ae18-6e08-436b-939f-03347eda68a8","Type":"ContainerStarted","Data":"cb910c1ce7004c90694e7cb0f97d872914ab3ce97eecec864fde592b9d28f229"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.512898 5116 generic.go:358] "Generic (PLEG): container finished" podID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerID="e7b62cb886ebeacc39c91f1f6acfa8bb840da76b7903d79d0c7c1dfa267d370b" exitCode=0 Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.512962 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" event={"ID":"44b1188c-0fa6-48c7-bf76-6e65ca8174ec","Type":"ContainerDied","Data":"e7b62cb886ebeacc39c91f1f6acfa8bb840da76b7903d79d0c7c1dfa267d370b"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.518566 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" 
event={"ID":"5df5e2ea-4661-49c7-95f2-d8c039bbea5d","Type":"ContainerStarted","Data":"35d6628d6171d1a0e5f1525c2fbbc4417fe95a5014f9ba89b2cbd6024dce0301"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.518608 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" event={"ID":"5df5e2ea-4661-49c7-95f2-d8c039bbea5d","Type":"ContainerStarted","Data":"60a9ef36afbb727be95c8dc1084b633e0aac362b1138282f574e1f604d2ec0b7"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.520315 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" event={"ID":"fe48c9c2-8783-475b-a961-d5a4110cb452","Type":"ContainerStarted","Data":"00325fd390abfcb658e5387701cd8c98e8285bb75c8b6feaf9931447deb8e216"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.521831 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" event={"ID":"bd443947-7241-49e3-9d98-f55329818dcc","Type":"ContainerStarted","Data":"edb740c8bef17d97c70456de5ee812ae7840e8ba280b1bce72e739c160de5438"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.523510 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" event={"ID":"56318568-cab8-4d5b-9a20-4531fc8aad60","Type":"ContainerStarted","Data":"d3d87ef59ddfee5645a922c56a1a308caadd0cb655dd046c6c70ba66ced1f4e5"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.523534 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" event={"ID":"56318568-cab8-4d5b-9a20-4531fc8aad60","Type":"ContainerStarted","Data":"d99035b44f8e9d301a6148212aba53b6f400724c4c4417749ae1e400595cc5ca"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.524990 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" event={"ID":"0910a8a1-0226-42c8-ab1d-b142d2b7a00d","Type":"ContainerStarted","Data":"2a9593d15ec672301e7be36642cd55d448e399a45c5270574baa719c0272c69d"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.525340 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" event={"ID":"0910a8a1-0226-42c8-ab1d-b142d2b7a00d","Type":"ContainerStarted","Data":"744e83dbeca747713e5bc106c8c1657865723be5795575912f982ea9219abf0c"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.530213 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-z8df4" event={"ID":"bd4c0dd5-6c39-4661-b0c6-424d6b061f04","Type":"ContainerStarted","Data":"12c3c52a4b89ff3cdb4a058248706e793e77a03881ad1275cd96a1ef75655d21"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.530238 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-74545575db-z8df4" event={"ID":"bd4c0dd5-6c39-4661-b0c6-424d6b061f04","Type":"ContainerStarted","Data":"a832625cfa0643de09d04cd2785bd616383620f4f27e61a8788477cb8e277991"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.538444 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" event={"ID":"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4","Type":"ContainerStarted","Data":"18112439814edd26fc6f9feabf6054a0029c97046be0d71f72e725b224eebd6e"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.540129 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-55dml" event={"ID":"dcd47220-17b3-4593-9a12-fb67c0c0dcc8","Type":"ContainerStarted","Data":"48bc4f44ea89b3eb16e668295ea44a7a0368109fe874f3f4d2c54a972d3b88ac"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.540159 5116 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-55dml" event={"ID":"dcd47220-17b3-4593-9a12-fb67c0c0dcc8","Type":"ContainerStarted","Data":"73630d9731f678fb90ce32c55553adde10ec0fb4b03b37d708406c8acaaee087"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.543434 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" event={"ID":"c7b12ebb-b568-4d15-abde-14db5041d5d2","Type":"ContainerStarted","Data":"56c33e03ce6f00b602f67882ebe171360b1da7dcfb34b6c4f3bfb4e1912f9cd3"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.543463 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" event={"ID":"c7b12ebb-b568-4d15-abde-14db5041d5d2","Type":"ContainerStarted","Data":"a49956e14fb6aa75f2f9c124c9e4161965fbc0b1c86e79ab712054c63185767d"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.545283 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerStarted","Data":"a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.545310 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerStarted","Data":"082b4604427fbc7c5d9bce23172c03602291dda6ccea1696f1f624d1746d3739"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.546828 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" event={"ID":"fc0602bb-5acd-426a-b3d6-3a2effb49bf3","Type":"ContainerStarted","Data":"332dafd2662f48ec31bf3a0a888d939eecc32e94040c64906ee34183b65eb690"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.546854 
5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" event={"ID":"fc0602bb-5acd-426a-b3d6-3a2effb49bf3","Type":"ContainerStarted","Data":"57c5e3a242c4b7c6908c9b7bb0828fa534c40b70be31283e92b4fbd2c7096def"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.567799 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" event={"ID":"65b5ebf3-054c-4827-96b2-7ea0a26f20af","Type":"ContainerStarted","Data":"4d440f2dfcddc54fd556bfe43abeb04733e8a7617dca10952526a1edc3bc3c85"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.571278 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkswl" event={"ID":"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a","Type":"ContainerStarted","Data":"dd7d3ee59ad7adc7ee11f8b9aeb0f81d38fef91b3e02560f14a01b0f64ff8f1a"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.591267 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerStarted","Data":"a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.593845 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerStarted","Data":"0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.600935 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" event={"ID":"413bd8dc-5257-4fd7-95c1-01f6d79278ee","Type":"ContainerStarted","Data":"85f1aefdf5628b11ce519a901f58bba27ced3b6d1638a2fc598e624271f63382"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 
00:10:43.605940 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerStarted","Data":"dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.608668 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" event={"ID":"93e67bda-2839-4364-9a75-54864090dc1f","Type":"ContainerStarted","Data":"9f662781c4eb461bb13ce5c5759b95a3882a62d90eac7634ad6302e955e47a44"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.608886 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.609012 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.108992776 +0000 UTC m=+115.131294139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.609335 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.610049 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.110029529 +0000 UTC m=+115.132330902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.611471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"cd9d650c3258cebfd58378de1572e0475377631fb57a20b94d8ed65a47d615be"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.614902 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" event={"ID":"a1258288-8146-4cba-9d66-2a88e35a1fe9","Type":"ContainerStarted","Data":"0b5fbe52956196216d6fe41d0239bf23cd3ebb86c349dfe490f8266571e7bea5"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.615084 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" event={"ID":"a1258288-8146-4cba-9d66-2a88e35a1fe9","Type":"ContainerStarted","Data":"9d8ced1bc6d028d7d194ab8a7ffc574ef4b83cb030688c5fdf0c163bedfc4755"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.617865 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" event={"ID":"b4117709-89bd-4e72-8016-0c25c0ece2c6","Type":"ContainerStarted","Data":"db8eaf8089a615a1169629a8ceff9d51faff3f489ba7e50a66264a1bc3dd2bc6"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.617901 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" 
event={"ID":"b4117709-89bd-4e72-8016-0c25c0ece2c6","Type":"ContainerStarted","Data":"ee29cde6b79f9126ff7254db86af5e5422a58b97f994568cce87584f05b97cc0"} Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.645478 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.655219 5116 patch_prober.go:28] interesting pod/oauth-openshift-66458b6674-8qfhd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" start-of-body= Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.655288 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": dial tcp 10.217.0.27:6443: connect: connection refused" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.663639 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" podStartSLOduration=93.663594959 podStartE2EDuration="1m33.663594959s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.663432704 +0000 UTC m=+114.685734107" watchObservedRunningTime="2026-03-22 00:10:43.663594959 +0000 UTC m=+114.685896332" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.687357 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podStartSLOduration=93.687338782 podStartE2EDuration="1m33.687338782s" 
podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.681157746 +0000 UTC m=+114.703459139" watchObservedRunningTime="2026-03-22 00:10:43.687338782 +0000 UTC m=+114.709640155" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.706956 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" podStartSLOduration=93.706937874 podStartE2EDuration="1m33.706937874s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.702972959 +0000 UTC m=+114.725274342" watchObservedRunningTime="2026-03-22 00:10:43.706937874 +0000 UTC m=+114.729239247" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.712023 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.712533 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.212512011 +0000 UTC m=+115.234813384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.713408 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.713413 5116 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-cp4p2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.713471 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podUID="fe48c9c2-8783-475b-a961-d5a4110cb452" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.715135 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:44.215114343 +0000 UTC m=+115.237415716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.723054 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-686468bdd5-pdqfv" podStartSLOduration=93.723035726 podStartE2EDuration="1m33.723035726s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.720972129 +0000 UTC m=+114.743273502" watchObservedRunningTime="2026-03-22 00:10:43.723035726 +0000 UTC m=+114.745337099" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.750634 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-5b9c976747-zvprn" podStartSLOduration=93.75061041 podStartE2EDuration="1m33.75061041s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.740857841 +0000 UTC m=+114.763159214" watchObservedRunningTime="2026-03-22 00:10:43.75061041 +0000 UTC m=+114.772911803" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.773687 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.783103 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podStartSLOduration=93.78308171 podStartE2EDuration="1m33.78308171s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.761874558 +0000 UTC m=+114.784175931" watchObservedRunningTime="2026-03-22 00:10:43.78308171 +0000 UTC m=+114.805383083" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.783233 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-r75lk" podStartSLOduration=6.783227195 podStartE2EDuration="6.783227195s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.780744916 +0000 UTC m=+114.803046289" watchObservedRunningTime="2026-03-22 00:10:43.783227195 +0000 UTC m=+114.805528598" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.800886 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-565b79b866-tlh4v" podStartSLOduration=93.800873215 podStartE2EDuration="1m33.800873215s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.800530564 +0000 UTC m=+114.822831947" watchObservedRunningTime="2026-03-22 00:10:43.800873215 +0000 UTC m=+114.823174588" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.823145 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.825312 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.32528385 +0000 UTC m=+115.347585233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.883083 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-75ffdb6fcd-m65lb" podStartSLOduration=93.883065613 podStartE2EDuration="1m33.883065613s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.850223771 +0000 UTC m=+114.872525164" watchObservedRunningTime="2026-03-22 00:10:43.883065613 +0000 UTC m=+114.905366976" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.883191 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" podStartSLOduration=93.883185757 
podStartE2EDuration="1m33.883185757s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.878123796 +0000 UTC m=+114.900425179" watchObservedRunningTime="2026-03-22 00:10:43.883185757 +0000 UTC m=+114.905487130" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.926805 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:43 crc kubenswrapper[5116]: E0322 00:10:43.927225 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.427209564 +0000 UTC m=+115.449510937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.935806 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.935888 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.953502 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podStartSLOduration=6.953483307 podStartE2EDuration="6.953483307s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.92172023 +0000 UTC m=+114.944021623" watchObservedRunningTime="2026-03-22 00:10:43.953483307 +0000 UTC m=+114.975784680" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.961884 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-54f497555d-lmct6" podStartSLOduration=93.961863333 
podStartE2EDuration="1m33.961863333s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.95955265 +0000 UTC m=+114.981854023" watchObservedRunningTime="2026-03-22 00:10:43.961863333 +0000 UTC m=+114.984164716" Mar 22 00:10:43 crc kubenswrapper[5116]: I0322 00:10:43.965760 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55512: no serving certificate available for the kubelet" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.015859 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-74545575db-z8df4" podStartSLOduration=94.015844486 podStartE2EDuration="1m34.015844486s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:43.982902081 +0000 UTC m=+115.005203454" watchObservedRunningTime="2026-03-22 00:10:44.015844486 +0000 UTC m=+115.038145849" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.017811 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podStartSLOduration=94.017802158 podStartE2EDuration="1m34.017802158s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:44.014441962 +0000 UTC m=+115.036743355" watchObservedRunningTime="2026-03-22 00:10:44.017802158 +0000 UTC m=+115.040103531" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.029947 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.030377 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.530350787 +0000 UTC m=+115.552652160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.043501 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-55dml" podStartSLOduration=7.043480763 podStartE2EDuration="7.043480763s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:44.041974806 +0000 UTC m=+115.064276179" watchObservedRunningTime="2026-03-22 00:10:44.043480763 +0000 UTC m=+115.065782136" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.133350 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") 
" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.134035 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.634017266 +0000 UTC m=+115.656318639 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.234857 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.234960 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.734936849 +0000 UTC m=+115.757238222 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.235100 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.235500 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.735490866 +0000 UTC m=+115.757792239 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.336472 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.336845 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.836828802 +0000 UTC m=+115.859130175 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.438392 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.438907 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:44.938893631 +0000 UTC m=+115.961195004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.539244 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.539827 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.039809593 +0000 UTC m=+116.062110956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.624596 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerStarted","Data":"cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.626118 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" event={"ID":"a224637e-e693-4ae7-89c3-1a01e6c9a6f5","Type":"ContainerStarted","Data":"e8d83bd6bf7d126be69f2a9272660c33e633ad82f3127df15394284067264add"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.628046 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" event={"ID":"8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0","Type":"ContainerStarted","Data":"9b8ced341e1f46400da0f6c65b177eea973fdf22bf43e44e35a845682529abf7"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.628861 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" event={"ID":"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee","Type":"ContainerStarted","Data":"f7d437beaa74078aa84a893bb3ee4d62bba1cafb5c44878654f173f2c29422ec"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.629623 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" event={"ID":"0eea7869-af21-4009-856f-65219d64ceea","Type":"ContainerStarted","Data":"6d0104ada666e7e6c2f6c7d2cea8673dd2d3432041ed2f46677719d43d77ffbc"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.630397 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" event={"ID":"73928a0d-7a97-4c03-a5e8-6ab37119261c","Type":"ContainerStarted","Data":"6ca4c8f48df8cb1d6dbce19ee117d828a1c9d132209d18860317344245884603"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.631404 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" event={"ID":"3bf5ae18-6e08-436b-939f-03347eda68a8","Type":"ContainerStarted","Data":"4db8ccc541c061d081a9323a77409d3f74489ae183df9ab8fd0f2e159d724317"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.632765 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" event={"ID":"44b1188c-0fa6-48c7-bf76-6e65ca8174ec","Type":"ContainerStarted","Data":"9b10560cd7a82f92bfd106f54992b60372f8b0733f2557ed6d35133f54e4a049"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.633796 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" event={"ID":"bd443947-7241-49e3-9d98-f55329818dcc","Type":"ContainerStarted","Data":"a989e336f64ff7cf1c58f5a42db0104202ed4b4d051c3f332e2c9e517781f9a7"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.634868 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" event={"ID":"bda6d33d-bd2b-4b65-85c1-767f0c3d1ae4","Type":"ContainerStarted","Data":"204cc0a6efcad763856a3cccc66e8b7f6eb8d5d7d836742b65a51dd8d5c44d68"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 
00:10:44.636427 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" event={"ID":"c7b12ebb-b568-4d15-abde-14db5041d5d2","Type":"ContainerStarted","Data":"dbe0e471a0166b6234040c27693dc6ad179beb1af99bb2ed63999c35c7c819b2"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.637991 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" event={"ID":"fc0602bb-5acd-426a-b3d6-3a2effb49bf3","Type":"ContainerStarted","Data":"e9a5c6a924e16844bc09c4b24e0eb4e75544e721253ee37e94fb371c303841c3"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.638982 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" event={"ID":"65b5ebf3-054c-4827-96b2-7ea0a26f20af","Type":"ContainerStarted","Data":"fbe1374ce073ed0e1d47b2e44a86e0a084fb5df52a66da08f5c5d03dd03252af"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.640146 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkswl" event={"ID":"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a","Type":"ContainerStarted","Data":"9edd670c034ec20665286543b42cdc84695e9c720e6d68f0bf0b804ab6d6f251"} Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.640462 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.640869 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:45.140851359 +0000 UTC m=+116.163152732 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.741840 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.741998 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.241962848 +0000 UTC m=+116.264264221 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.742399 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.742837 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.242812494 +0000 UTC m=+116.265113887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.758913 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.758963 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.760626 5116 patch_prober.go:28] interesting pod/apiserver-8596bd845d-f59q2 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.760705 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" podUID="68cdf6e7-fccd-4375-9688-7a2bcbefd82f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.843777 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc 
kubenswrapper[5116]: E0322 00:10:44.843938 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.343907592 +0000 UTC m=+116.366208965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.844229 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.844563 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.344551043 +0000 UTC m=+116.366852416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.937936 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:44 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:44 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:44 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.938006 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.945122 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.945245 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:45.445225207 +0000 UTC m=+116.467526580 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.945654 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:44 crc kubenswrapper[5116]: E0322 00:10:44.945934 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.44592352 +0000 UTC m=+116.468224903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.965122 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.965277 5116 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-cp4p2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.965338 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podUID="fe48c9c2-8783-475b-a961-d5a4110cb452" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 22 00:10:44 crc kubenswrapper[5116]: I0322 00:10:44.990707 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.046966 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 
00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.047399 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.547273636 +0000 UTC m=+116.569575009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.056268 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.059863 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.559840654 +0000 UTC m=+116.582142257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.158029 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.158464 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.658447753 +0000 UTC m=+116.680749126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.188949 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.259612 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.260021 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.760004036 +0000 UTC m=+116.782305409 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.267887 5116 ???:1] "http: TLS handshake error from 192.168.126.11:55520: no serving certificate available for the kubelet" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.357150 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.357856 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.358110 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.358670 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.358701 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.360381 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.360535 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.860514195 +0000 UTC m=+116.882815568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361055 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361278 5116 patch_prober.go:28] interesting pod/console-operator-67c89758df-vnd4f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361327 5116 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-67c89758df-vnd4f" podUID="b4117709-89bd-4e72-8016-0c25c0ece2c6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361407 5116 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-td5gr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.361428 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.861414863 +0000 UTC m=+116.883716236 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361430 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podUID="a224637e-e693-4ae7-89c3-1a01e6c9a6f5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361484 5116 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-lf2zm 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361510 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361537 5116 patch_prober.go:28] interesting pod/catalog-operator-75ff9f647d-47j6l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.361572 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" podUID="a1258288-8146-4cba-9d66-2a88e35a1fe9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.371675 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" podStartSLOduration=95.371657259 podStartE2EDuration="1m35.371657259s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.37106343 +0000 UTC m=+116.393364823" watchObservedRunningTime="2026-03-22 00:10:45.371657259 +0000 UTC m=+116.393958642" Mar 22 00:10:45 
crc kubenswrapper[5116]: I0322 00:10:45.397018 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" podStartSLOduration=95.396996313 podStartE2EDuration="1m35.396996313s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.391814777 +0000 UTC m=+116.414116150" watchObservedRunningTime="2026-03-22 00:10:45.396996313 +0000 UTC m=+116.419297686" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.444786 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" podStartSLOduration=96.444770499 podStartE2EDuration="1m36.444770499s" podCreationTimestamp="2026-03-22 00:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.428096159 +0000 UTC m=+116.450397542" watchObservedRunningTime="2026-03-22 00:10:45.444770499 +0000 UTC m=+116.467071872" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.462740 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.464658 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:45.964640348 +0000 UTC m=+116.986941721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.475203 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-69b85846b6-l4jwl" podStartSLOduration=95.475183274 podStartE2EDuration="1m35.475183274s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.446490133 +0000 UTC m=+116.468791506" watchObservedRunningTime="2026-03-22 00:10:45.475183274 +0000 UTC m=+116.497484657" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.476710 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podStartSLOduration=95.476701031 podStartE2EDuration="1m35.476701031s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.474619306 +0000 UTC m=+116.496920699" watchObservedRunningTime="2026-03-22 00:10:45.476701031 +0000 UTC m=+116.499002424" Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.565518 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: 
\"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.566019 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.066002975 +0000 UTC m=+117.088304348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.649929 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" event={"ID":"cf866f04-6739-40da-8c1c-36d192472220","Type":"ContainerStarted","Data":"98b80ed2f825f571051f8dfd1b02b4796223af2136881bbb7ab621d86c90d337"} Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.654944 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" event={"ID":"50968323-e2f8-4e7b-8cdb-8eb4c4cb5dee","Type":"ContainerStarted","Data":"04df04b126b799a76ce4d5634b109696cac20a5d85878acb5c48cd2efa1e4072"} Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.658586 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" event={"ID":"0eea7869-af21-4009-856f-65219d64ceea","Type":"ContainerStarted","Data":"a42679411685c6512bbfe6984a3a9a428152fe26ec7d0bfc12531ebbd29fbab8"} Mar 22 00:10:45 crc 
kubenswrapper[5116]: I0322 00:10:45.661751 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" event={"ID":"2512f5ef-a611-4637-b41f-41185def421b","Type":"ContainerStarted","Data":"80594bbc1ca27647c8954e15361d62b3ad3a7e1a7eec0475d5e013d18ce330ec"} Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.666521 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.666887 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.166871496 +0000 UTC m=+117.189172869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.667908 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6b564684c8-dfx6t" podStartSLOduration=95.667883758 podStartE2EDuration="1m35.667883758s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.498073489 +0000 UTC m=+116.520374872" watchObservedRunningTime="2026-03-22 00:10:45.667883758 +0000 UTC m=+116.690185141"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.667984 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" event={"ID":"73928a0d-7a97-4c03-a5e8-6ab37119261c","Type":"ContainerStarted","Data":"ee91d3ca0e39b56fda13e403a733c1c343b35f8469fa149daa62e5e7bdf51948"}
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.668222 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-575994946d-jxbnl" podStartSLOduration=95.668215879 podStartE2EDuration="1m35.668215879s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.665902285 +0000 UTC m=+116.688203658" watchObservedRunningTime="2026-03-22 00:10:45.668215879 +0000 UTC m=+116.690517262"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.672409 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" event={"ID":"bd443947-7241-49e3-9d98-f55329818dcc","Type":"ContainerStarted","Data":"0a1399231ec5c949259d7c311833b826b69a3a1d154f34f94c926c838d5b3c82"}
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.686757 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" gracePeriod=30
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.687782 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rkswl" event={"ID":"0ae3ce26-1619-4bc6-925c-a4fb4e41cd9a","Type":"ContainerStarted","Data":"6990c979394f4a46fe115e5e48d81ac6083326bf20dc3e26298eb49139789f17"}
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.687821 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689281 5116 patch_prober.go:28] interesting pod/olm-operator-5cdf44d969-cp4p2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689323 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2" podUID="fe48c9c2-8783-475b-a961-d5a4110cb452" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689814 5116 patch_prober.go:28] interesting pod/catalog-operator-75ff9f647d-47j6l container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.689874 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l" podUID="a1258288-8146-4cba-9d66-2a88e35a1fe9" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695463 5116 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-td5gr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695521 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podUID="a224637e-e693-4ae7-89c3-1a01e6c9a6f5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695881 5116 patch_prober.go:28] interesting pod/console-operator-67c89758df-vnd4f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.695909 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-67c89758df-vnd4f" podUID="b4117709-89bd-4e72-8016-0c25c0ece2c6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/readyz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.696108 5116 patch_prober.go:28] interesting pod/marketplace-operator-547dbd544d-lf2zm container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.696233 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.759269 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-69d5f845f8-sk5s4" podStartSLOduration=95.759254017 podStartE2EDuration="1m35.759254017s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.727719976 +0000 UTC m=+116.750021349" watchObservedRunningTime="2026-03-22 00:10:45.759254017 +0000 UTC m=+116.781555390"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.763123 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-67c9d58cbb-l9f6f" podStartSLOduration=95.76310565 podStartE2EDuration="1m35.76310565s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.757887964 +0000 UTC m=+116.780189347" watchObservedRunningTime="2026-03-22 00:10:45.76310565 +0000 UTC m=+116.785407013"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.769719 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.771714 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.271693482 +0000 UTC m=+117.293994945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.787537 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-f9cdd68f7-wfffg" podStartSLOduration=95.787514704 podStartE2EDuration="1m35.787514704s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.776952769 +0000 UTC m=+116.799254152" watchObservedRunningTime="2026-03-22 00:10:45.787514704 +0000 UTC m=+116.809816077"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.799661 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rkswl" podStartSLOduration=8.799644699 podStartE2EDuration="8.799644699s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.799399901 +0000 UTC m=+116.821701284" watchObservedRunningTime="2026-03-22 00:10:45.799644699 +0000 UTC m=+116.821946072"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.815710 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podStartSLOduration=95.815693958 podStartE2EDuration="1m35.815693958s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.814840321 +0000 UTC m=+116.837141704" watchObservedRunningTime="2026-03-22 00:10:45.815693958 +0000 UTC m=+116.837995331"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.840668 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-69db94689b-bkst6" podStartSLOduration=95.8406517 podStartE2EDuration="1m35.8406517s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.837526261 +0000 UTC m=+116.859827634" watchObservedRunningTime="2026-03-22 00:10:45.8406517 +0000 UTC m=+116.862953073"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.865464 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-6b9cb4dbcf-qdtdp" podStartSLOduration=95.865439696 podStartE2EDuration="1m35.865439696s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.862237985 +0000 UTC m=+116.884539378" watchObservedRunningTime="2026-03-22 00:10:45.865439696 +0000 UTC m=+116.887741069"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.876639 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.877039 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.377018704 +0000 UTC m=+117.399320077 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.877182 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.877623 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.377613673 +0000 UTC m=+117.399915046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.883494 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-866fcbc849-m5rc6" podStartSLOduration=95.883477639 podStartE2EDuration="1m35.883477639s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:45.882910871 +0000 UTC m=+116.905212264" watchObservedRunningTime="2026-03-22 00:10:45.883477639 +0000 UTC m=+116.905779022"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.935609 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:45 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:45 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:45 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.935699 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.978412 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.978588 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.478558136 +0000 UTC m=+117.500859509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:45 crc kubenswrapper[5116]: I0322 00:10:45.978808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:45 crc kubenswrapper[5116]: E0322 00:10:45.979137 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.479118024 +0000 UTC m=+117.501419397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.079710 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.079858 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.57983615 +0000 UTC m=+117.602137523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.079966 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.080315 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.580304805 +0000 UTC m=+117.602606178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.181491 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.181733 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.681696072 +0000 UTC m=+117.703997445 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.182123 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.182692 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.682649582 +0000 UTC m=+117.704950985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.284013 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.284262 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.784230326 +0000 UTC m=+117.806531699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.284867 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.285220 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.785206206 +0000 UTC m=+117.807507579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.357831 5116 patch_prober.go:28] interesting pod/oauth-openshift-66458b6674-8qfhd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.27:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.357914 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.27:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.385949 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.386361 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.886344635 +0000 UTC m=+117.908646008 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.488337 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.488747 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:46.988725045 +0000 UTC m=+118.011026478 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.590192 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.590364 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.090331248 +0000 UTC m=+118.112632621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.590587 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.590903 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.090890766 +0000 UTC m=+118.113192139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.691478 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.691719 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.191677584 +0000 UTC m=+118.213978957 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.692527 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.692924 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.192915674 +0000 UTC m=+118.215217047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.695200 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" event={"ID":"65b5ebf3-054c-4827-96b2-7ea0a26f20af","Type":"ContainerStarted","Data":"cb8e992c97289e8677dffbf9ed31d0c2a9abf46b9937c4c28ac09206f90119aa"}
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.697210 5116 patch_prober.go:28] interesting pod/packageserver-7d4fc7d867-td5gr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.697262 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr" podUID="a224637e-e693-4ae7-89c3-1a01e6c9a6f5" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.724710 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-799b87ffcd-42tp2" podStartSLOduration=96.724689471 podStartE2EDuration="1m36.724689471s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:46.72244969 +0000 UTC m=+117.744751073" watchObservedRunningTime="2026-03-22 00:10:46.724689471 +0000 UTC m=+117.746990854"
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.793795 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.794031 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.293995081 +0000 UTC m=+118.316296474 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.794728 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.796383 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.296368416 +0000 UTC m=+118.318669789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.895798 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.896206 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.396188163 +0000 UTC m=+118.418489536 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.941705 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:46 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:46 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:46 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.941774 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:46 crc kubenswrapper[5116]: I0322 00:10:46.997675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:46 crc kubenswrapper[5116]: E0322 00:10:46.997974 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:47.497961773 +0000 UTC m=+118.520263146 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.099222 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.099526 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.599508175 +0000 UTC m=+118.621809558 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.201485 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.201921 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.701900854 +0000 UTC m=+118.724202227 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.302931 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.303444 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.803428106 +0000 UTC m=+118.825729469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.404962 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.405375 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:47.90536214 +0000 UTC m=+118.927663513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.506589 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.506985 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.006967995 +0000 UTC m=+119.029269368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.608250 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.608614 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.10859476 +0000 UTC m=+119.130896133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.613452 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.710144 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.710570 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.210545814 +0000 UTC m=+119.232847207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.812005 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.812318 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.312304613 +0000 UTC m=+119.334605986 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.855410 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.859859 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler\"/\"installer-sa-dockercfg-qpkss\"" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.862158 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"14cfa7846d0313cc9092a8f3bb677b9d20ea75addf2ca3bd62b03f25f030db5b"} Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.862209 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.862713 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler\"/\"kube-root-ca.crt\"" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.882051 5116 ???:1] "http: TLS handshake error from 192.168.126.11:48434: no serving certificate available for the kubelet" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.913892 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.913989 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.413964989 +0000 UTC m=+119.436266362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.914079 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.914116 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.914428 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:47 crc kubenswrapper[5116]: E0322 00:10:47.914468 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:48.414459924 +0000 UTC m=+119.436761297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.939544 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:47 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:47 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:47 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:47 crc kubenswrapper[5116]: I0322 00:10:47.939631 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015422 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.015653 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.515617054 +0000 UTC m=+119.537918427 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015795 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015949 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.015961 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 
00:10:48.016069 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.016265 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.516256725 +0000 UTC m=+119.538558168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.073050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"revision-pruner-6-crc\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.117883 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 
00:10:48.118405 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.618386666 +0000 UTC m=+119.640688039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.190202 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.219302 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.219720 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.7197075 +0000 UTC m=+119.742008873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.320460 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.320776 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.820759357 +0000 UTC m=+119.843060730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.424148 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.425029 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:48.925013505 +0000 UTC m=+119.947314878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.532402 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.532778 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.032761285 +0000 UTC m=+120.055062658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.633817 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.634127 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.13411498 +0000 UTC m=+120.156416353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.640871 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/revision-pruner-6-crc"] Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.690548 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.692259 5116 patch_prober.go:28] interesting pod/openshift-config-operator-5777786469-wb6r8 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.692310 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podUID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.712450 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerStarted","Data":"5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92"} Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 
00:10:48.734549 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.734946 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.234927219 +0000 UTC m=+120.257228592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.836266 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.836804 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:49.336789972 +0000 UTC m=+120.359091345 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.852052 5116 patch_prober.go:28] interesting pod/openshift-config-operator-5777786469-wb6r8 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.852117 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podUID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.852378 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.935748 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:48 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:48 crc kubenswrapper[5116]: [+]process-running ok Mar 22 
00:10:48 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.935971 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.937630 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.937794 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.437773676 +0000 UTC m=+120.460075049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:48 crc kubenswrapper[5116]: I0322 00:10:48.938008 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:48 crc kubenswrapper[5116]: E0322 00:10:48.938358 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.438345534 +0000 UTC m=+120.460646907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.039319 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.039678 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.539662208 +0000 UTC m=+120.561963581 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.140835 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.141300 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.641279603 +0000 UTC m=+120.663580976 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.241906 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.242158 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.742124683 +0000 UTC m=+120.764426056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.280781 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.343982 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.344352 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.844337607 +0000 UTC m=+120.866638980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.367258 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.367409 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.379648 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.434783 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.434846 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445127 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.445321 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.945296231 +0000 UTC m=+120.967597604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445697 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445738 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445768 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " 
pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.445954 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.446243 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:49.946151217 +0000 UTC m=+120.968452640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.468907 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.475308 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.478499 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.487639 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.546970 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.547124 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.0471008 +0000 UTC m=+121.069402173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547374 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547471 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547813 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547847 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"community-operators-zrcmf\" (UID: 
\"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547880 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547896 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.547997 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.548486 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:50.048467574 +0000 UTC m=+121.070768947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.548482 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") " pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.556936 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.556997 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.590607 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"certified-operators-t4x6l\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") 
" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.649288 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.649472 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.149442268 +0000 UTC m=+121.171743641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.649908 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"community-operators-zrcmf\" (UID: 
\"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650161 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650230 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.650526 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.150517452 +0000 UTC m=+121.172818825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650656 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.650664 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.679405 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"community-operators-zrcmf\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") " pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.682138 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hppqv"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.685444 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\"" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.692533 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.757010 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.757575 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.257543978 +0000 UTC m=+121.279845351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.792057 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.805816 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.806434 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.819822 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.819877 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerStarted","Data":"5eba216ccdb085e6ee94d8295dcb56560aa662e5daa7c913adf5cc47ec899e24"} Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.819985 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.821185 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.847364 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-8596bd845d-f59q2" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.847400 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"3a14caf222afb62aaabdc47808b6f944","Type":"ContainerStarted","Data":"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f"} Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.848117 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859414 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859565 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859621 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: 
\"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.859657 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.861147 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.361132675 +0000 UTC m=+121.383434048 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.910263 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-npbn6"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.939208 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.941311 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npbn6"] Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.951656 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:49 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:49 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:49 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.951759 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.961837 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962010 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962071 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962118 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.962885 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:49 crc kubenswrapper[5116]: E0322 00:10:49.962963 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.462934216 +0000 UTC m=+121.485235579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:49 crc kubenswrapper[5116]: I0322 00:10:49.963199 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.022892 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"certified-operators-hppqv\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") " pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.052944 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/revision-pruner-6-crc" podStartSLOduration=3.052924161 podStartE2EDuration="3.052924161s" podCreationTimestamp="2026-03-22 00:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:49.993213766 +0000 UTC m=+121.015515149" watchObservedRunningTime="2026-03-22 00:10:50.052924161 +0000 UTC m=+121.075225534" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065628 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065687 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065713 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.065796 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.066065 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.566053408 +0000 UTC m=+121.588354781 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.159022 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166370 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166519 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166608 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.166663 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.167043 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.167124 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.667109484 +0000 UTC m=+121.689410857 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.167365 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.225193 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.225177067 podStartE2EDuration="28.225177067s" podCreationTimestamp="2026-03-22 00:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:50.223488963 +0000 UTC m=+121.245790346" watchObservedRunningTime="2026-03-22 00:10:50.225177067 +0000 UTC m=+121.247478440" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.238011 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"community-operators-npbn6\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") " pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.251261 5116 patch_prober.go:28] interesting pod/openshift-config-operator-5777786469-wb6r8 container/openshift-config-operator 
namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" start-of-body= Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.251336 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" podUID="44b1188c-0fa6-48c7-bf76-6e65ca8174ec" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": context deadline exceeded" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.268907 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.269287 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.769269586 +0000 UTC m=+121.791570959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.277955 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npbn6" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.339891 5116 patch_prober.go:28] interesting pod/apiserver-9ddfb9f55-m5dds container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]log ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]etcd ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/generic-apiserver-start-informers ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/max-in-flight-filter ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 22 00:10:50 crc kubenswrapper[5116]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 22 00:10:50 crc kubenswrapper[5116]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/project.openshift.io-projectcache ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/openshift.io-startinformers ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 22 00:10:50 crc kubenswrapper[5116]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 22 00:10:50 crc kubenswrapper[5116]: livez check failed Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.339974 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" podUID="8e08ce20-926b-4ed4-a4b7-c1dcd1ab28b0" 
containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.369937 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.370529 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.870509648 +0000 UTC m=+121.892811021 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.406596 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.461656 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.461688 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64d44f6ddf-9g5sg" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.478089 5116 patch_prober.go:28] interesting 
pod/console-64d44f6ddf-9g5sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.478154 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-9g5sg" podUID="4c2755ce-817d-47b0-9f19-7218641d0c5b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.479236 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.479701 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:50.979685722 +0000 UTC m=+122.001987095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.543278 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.585031 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.586649 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.086626766 +0000 UTC m=+122.108928139 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.687040 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.687466 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.187421324 +0000 UTC m=+122.209722697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.735221 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"] Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.788594 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.788948 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.288928876 +0000 UTC m=+122.311230259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.847265 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerStarted","Data":"58966d6ce1821ab01be1615acae0ea7c0ed73c7caaf2ca09de37bd0dfaf8db3c"} Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.848227 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerStarted","Data":"c8dbd89a41371e9d08f38390365ebbf2b2a5481a8e6093a86e6911bc41519ed3"} Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.849391 5116 generic.go:358] "Generic (PLEG): container finished" podID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerID="5eba216ccdb085e6ee94d8295dcb56560aa662e5daa7c913adf5cc47ec899e24" exitCode=0 Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.849442 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerDied","Data":"5eba216ccdb085e6ee94d8295dcb56560aa662e5daa7c913adf5cc47ec899e24"} Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.871473 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerStarted","Data":"96e17397bb6bb8c0e2ce3437a4637533d85e31f9659bc8478852a483b13dd5dd"} 
Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.890036 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.890399 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.390371384 +0000 UTC m=+122.412672757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.932226 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.935026 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:50 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:50 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:50 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:50 
crc kubenswrapper[5116]: I0322 00:10:50.935107 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.990889 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.991047 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.491021868 +0000 UTC m=+122.513323241 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.991436 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:50 crc kubenswrapper[5116]: E0322 00:10:50.992780 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.492762453 +0000 UTC m=+122.515063936 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:50 crc kubenswrapper[5116]: I0322 00:10:50.992881 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npbn6"] Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.092720 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.093094 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.593079306 +0000 UTC m=+122.615380679 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.194324 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.194706 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.694687461 +0000 UTC m=+122.716988834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.295855 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.296053 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.796024746 +0000 UTC m=+122.818326119 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.296731 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.297088 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.797072059 +0000 UTC m=+122.819373432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.398502 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.398688 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.898649693 +0000 UTC m=+122.920951066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.398996 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.399311 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:51.899301464 +0000 UTC m=+122.921602837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.467581 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.500191 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.500763 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.000743672 +0000 UTC m=+123.023045045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.601988 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.602359 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.102343456 +0000 UTC m=+123.124644829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.702762 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.703184 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.203153545 +0000 UTC m=+123.225454918 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.804151 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.804589 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.304573853 +0000 UTC m=+123.326875236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.876373 5116 generic.go:358] "Generic (PLEG): container finished" podID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerID="48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7" exitCode=0 Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.879200 5116 generic.go:358] "Generic (PLEG): container finished" podID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791" exitCode=0 Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.880850 5116 generic.go:358] "Generic (PLEG): container finished" podID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerID="650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743" exitCode=0 Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.905769 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.905940 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:52.405913529 +0000 UTC m=+123.428214902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.906158 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:51 crc kubenswrapper[5116]: E0322 00:10:51.906541 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.406523507 +0000 UTC m=+123.428824960 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.936754 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:51 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:51 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:51 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:51 crc kubenswrapper[5116]: I0322 00:10:51.936823 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.008665 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.009078 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:52.509061362 +0000 UTC m=+123.531362735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.110556 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.111009 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.610989886 +0000 UTC m=+123.633291259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.211541 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.211717 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.711684101 +0000 UTC m=+123.733985484 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.212044 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.212424 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.712410644 +0000 UTC m=+123.734712017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.313923 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.314239 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.814206784 +0000 UTC m=+123.836508167 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.314874 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315033 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerDied","Data":"48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315214 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315266 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerStarted","Data":"45e4dcc16acc5b874128c4be6959ee16a39761f0d8b86acfc6b789f6f799c066"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315290 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" 
event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315358 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-5777786469-wb6r8" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.315415 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.319652 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:52.819635527 +0000 UTC m=+123.841936900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.321585 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.328467 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743"} Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.328527 5116 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416588 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416862 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416931 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.416963 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.417974 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:52.917953826 +0000 UTC m=+123.940255209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.459656 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.459854 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.473358 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.497721 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.497849 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.517944 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518005 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518036 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518076 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518105 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518132 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.518153 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.519669 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.019652943 +0000 UTC m=+124.041954316 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.519750 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.519982 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.544989 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.559709 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"redhat-marketplace-kp7rb\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") " pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.562346 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.562531 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.569300 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.619548 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.619773 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.119735599 +0000 UTC m=+124.142036992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.619891 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") pod \"06481163-cdc1-4b43-b6c2-73f2672feb42\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620030 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "06481163-cdc1-4b43-b6c2-73f2672feb42" (UID: "06481163-cdc1-4b43-b6c2-73f2672feb42"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620073 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") pod \"06481163-cdc1-4b43-b6c2-73f2672feb42\" (UID: \"06481163-cdc1-4b43-b6c2-73f2672feb42\") " Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620243 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620291 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620344 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620392 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"redhat-marketplace-wlccm\" (UID: 
\"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620486 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620530 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.620560 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.120543394 +0000 UTC m=+124.142844757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620827 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.620834 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.621050 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/06481163-cdc1-4b43-b6c2-73f2672feb42-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.621334 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.627855 5116 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "06481163-cdc1-4b43-b6c2-73f2672feb42" (UID: "06481163-cdc1-4b43-b6c2-73f2672feb42"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.639646 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"redhat-marketplace-wlccm\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.669207 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.683357 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerName="pruner" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.683395 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerName="pruner" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.683690 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="06481163-cdc1-4b43-b6c2-73f2672feb42" containerName="pruner" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.685758 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.723282 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.723620 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.223596655 +0000 UTC m=+124.245898028 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726185 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726376 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"redhat-operators-wss9d\" (UID: 
\"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726544 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726649 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.726798 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/06481163-cdc1-4b43-b6c2-73f2672feb42-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.727608 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.727933 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc 
kubenswrapper[5116]: E0322 00:10:52.728390 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.228375296 +0000 UTC m=+124.250676669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.747643 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"redhat-operators-wss9d\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") " pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.789706 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.828645 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.828898 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.328870855 +0000 UTC m=+124.351172238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.829377 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.829899 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: 
nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.329890528 +0000 UTC m=+124.352191901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.890715 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d" Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.903413 5116 generic.go:358] "Generic (PLEG): container finished" podID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867" exitCode=0 Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.930787 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.931092 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.431065557 +0000 UTC m=+124.453366930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.931496 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:52 crc kubenswrapper[5116]: E0322 00:10:52.931873 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.431851873 +0000 UTC m=+124.454153306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.934643 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:52 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:52 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:52 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:52 crc kubenswrapper[5116]: I0322 00:10:52.934706 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.033269 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.033561 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:53.533527059 +0000 UTC m=+124.555828442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.034039 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.034367 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.534353556 +0000 UTC m=+124.556654929 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.057214 5116 ???:1] "http: TLS handshake error from 192.168.126.11:48436: no serving certificate available for the kubelet" Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.135797 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.136250 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.636229998 +0000 UTC m=+124.658531371 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.237636 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.237978 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.737965236 +0000 UTC m=+124.760266609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.339048 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.339225 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.839197349 +0000 UTC m=+124.861498722 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.339467 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.339764 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.839751246 +0000 UTC m=+124.862052619 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.440527 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.440834 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.940783282 +0000 UTC m=+124.963084655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.441393 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.441718 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:53.941702301 +0000 UTC m=+124.964003674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.542469 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.542692 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.042656014 +0000 UTC m=+125.064957407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.543101 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.543465 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.04344877 +0000 UTC m=+125.065750213 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.643914 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.644106 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.144078893 +0000 UTC m=+125.166380266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.644603 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.644959 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.14494254 +0000 UTC m=+125.167243913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.746356 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.746581 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.246548324 +0000 UTC m=+125.268849707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.746777 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.747138 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.247119742 +0000 UTC m=+125.269421115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.847607 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.847837 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.347807577 +0000 UTC m=+125.370108950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.847941 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.848352 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.348334144 +0000 UTC m=+125.370635517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.934851 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:53 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:53 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:53 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.934926 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.949525 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.949712 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:54.44967492 +0000 UTC m=+125.471976293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:53 crc kubenswrapper[5116]: I0322 00:10:53.950341 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:53 crc kubenswrapper[5116]: E0322 00:10:53.950713 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.450698082 +0000 UTC m=+125.472999455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.051462 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.051657 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.551629515 +0000 UTC m=+125.573930888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.051937 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.052254 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.552242384 +0000 UTC m=+125.574543757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116694 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/revision-pruner-6-crc" event={"ID":"06481163-cdc1-4b43-b6c2-73f2672feb42","Type":"ContainerDied","Data":"5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116748 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116806 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/revision-pruner-6-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.116998 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125037 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125080 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125092 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125115 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125134 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125153 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"75f960061cbc0329dfd322c24b6916521266afa146009455e9c761967544b284"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.125186 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.153413 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc 
kubenswrapper[5116]: E0322 00:10:54.153609 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.65357684 +0000 UTC m=+125.675878213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.153731 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.154118 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.654102017 +0000 UTC m=+125.676403490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.254744 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.254921 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.754891195 +0000 UTC m=+125.777192568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255268 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255303 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255344 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.255826 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod 
\"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.256211 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.756188926 +0000 UTC m=+125.778490309 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.356751 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.356924 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.856894711 +0000 UTC m=+125.879196084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357365 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357438 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357471 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357513 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"redhat-operators-fbgnq\" (UID: 
\"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.357884 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.857867572 +0000 UTC m=+125.880168945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.357888 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.358026 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"redhat-operators-fbgnq\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.376807 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"redhat-operators-fbgnq\" (UID: 
\"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430590 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerStarted","Data":"96be2eef25c44ecc457a8a1fa10ad35be205dda004792bd0cccdb43de654bdc4"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430652 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerStarted","Data":"82447639f8664a7a9be68c50329975aae58c8919cedaf2554c6f5ebb2a14ac22"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430676 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerStarted","Data":"4076d8a97891e463c22fe9847edcf67c692b44a8dbcd9aa75ba00b5c2c7fdc81"} Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430694 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.430805 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.432625 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\"" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.432642 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\"" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.439862 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.440802 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.460000 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.460367 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:54.960350144 +0000 UTC m=+125.982651517 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.520863 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-9ddfb9f55-m5dds" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563532 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563563 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.563602 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.564677 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.564788 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.564991 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.567609 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:55.067585697 +0000 UTC m=+126.089887070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.568006 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.569047 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.571737 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.579589 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.587056 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-networking-console-plugin-cert\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.602400 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/6a9ae5f6-97bd-46ac-bafa-ca1b4452a141-nginx-conf\") pod \"networking-console-plugin-5ff7774fd9-nljh6\" (UID: \"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141\") " pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.608472 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwt8b\" (UniqueName: \"kubernetes.io/projected/17b87002-b798-480a-8e17-83053d698239-kube-api-access-gwt8b\") pod \"network-check-target-fhkjl\" (UID: \"17b87002-b798-480a-8e17-83053d698239\") " pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.609271 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7w75\" (UniqueName: \"kubernetes.io/projected/f863fff9-286a-45fa-b8f0-8a86994b8440-kube-api-access-l7w75\") pod \"network-check-source-5bb8f5cd97-xdvz5\" (UID: \"f863fff9-286a-45fa-b8f0-8a86994b8440\") " pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.622487 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.641483 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fhkjl" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.653369 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.666081 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.666353 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.16632136 +0000 UTC m=+126.188622733 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.666484 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.667407 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.667459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.668868 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.16885422 +0000 UTC m=+126.191155593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.669061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.694967 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"revision-pruner-11-crc\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") " pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.768843 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.769448 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.269431812 +0000 UTC m=+126.291733185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.835544 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.854499 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.871156 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.872706 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.372684848 +0000 UTC m=+126.394986221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.945379 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 22 00:10:54 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld Mar 22 00:10:54 crc kubenswrapper[5116]: [+]process-running ok Mar 22 00:10:54 crc kubenswrapper[5116]: healthz check failed Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.945433 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.949781 5116 generic.go:358] "Generic (PLEG): container finished" podID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" exitCode=0 Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.949887 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5"} Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.969965 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: 
code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.970196 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerStarted","Data":"60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c"} Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.972578 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.973314 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.973481 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.473455996 +0000 UTC m=+126.495757379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.974133 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.974187 5116 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" probeResult="unknown" Mar 22 00:10:54 crc kubenswrapper[5116]: I0322 00:10:54.974449 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:54 crc kubenswrapper[5116]: E0322 00:10:54.974733 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. 
No retries permitted until 2026-03-22 00:10:55.474722535 +0000 UTC m=+126.497023908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.075747 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.075899 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.575877375 +0000 UTC m=+126.598178748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.077008 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.078957 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.578937052 +0000 UTC m=+126.601238425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.142085 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-11-crc"] Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.180504 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.180799 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.680766444 +0000 UTC m=+126.703067817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.287485 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.288113 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.78809852 +0000 UTC m=+126.810399893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.347528 5116 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.363716 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.390627 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") " Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.390983 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.890962124 +0000 UTC m=+126.913263497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.492886 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.493214 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:55.993198497 +0000 UTC m=+127.015499870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.593666 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.593897 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:56.093880522 +0000 UTC m=+127.116181895 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.693736 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-5cdf44d969-cp4p2"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.694727 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.694986 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName: nodeName:}" failed. No retries permitted until 2026-03-22 00:10:56.19497477 +0000 UTC m=+127.217276143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "image-registry-66587d64c8-zwkhp" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.695936 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-75ff9f647d-47j6l"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.717073 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rkswl"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.717134 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-67c89758df-vnd4f"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.717184 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.803800 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:55 crc kubenswrapper[5116]: E0322 00:10:55.804954 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2 podName:9e9b5059-1b3e-4067-a63d-2952cbe863af nodeName:}" failed. No retries permitted until 2026-03-22 00:10:56.30493441 +0000 UTC m=+127.327235793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.806679 5116 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-22T00:10:55.347558696Z","UUID":"eb6ef366-15c5-4ad3-9363-4593b9e8b329","Handler":null,"Name":"","Endpoint":""}
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.809827 5116 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.809857 5116 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.906882 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.912945 5116 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.912984 5116 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b1264ac67579ad07e7e9003054d44fe40dd55285a4b2f7dc74e48be1aee0868a/globalmount\"" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.936350 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:55 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:55 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:55 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.936608 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:55 crc kubenswrapper[5116]: I0322 00:10:55.992049 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-66587d64c8-zwkhp\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.001971 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerStarted","Data":"130739c8c4fc9560744543b32a2dedebe00df0b00c06be853ce09fafeeaac564"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.002054 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerStarted","Data":"25a14810c25178d5701d5a77da67d4d6de2709a0d6bd4a338a91bc57a5e63c27"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.013393 5116 generic.go:358] "Generic (PLEG): container finished" podID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerID="60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c" exitCode=0
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.013559 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.013681 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"9e9b5059-1b3e-4067-a63d-2952cbe863af\" (UID: \"9e9b5059-1b3e-4067-a63d-2952cbe863af\") "
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.022114 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"936d15425a95ca7340bf36b5d29e85e0b504aee88e45479073a76084b212cad1"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.022321 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fhkjl" event={"ID":"17b87002-b798-480a-8e17-83053d698239","Type":"ContainerStarted","Data":"51467a890456af3ec283aae28bac52ca164efa856026a21eda41dfc79e8cffb4"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.022840 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.025471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"037bb9ee1a61b40ccd405d0e4fad4e60c2db137b49afe4bcb9c35c5d0f724be3"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.025502 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5ff7774fd9-nljh6" event={"ID":"6a9ae5f6-97bd-46ac-bafa-ca1b4452a141","Type":"ContainerStarted","Data":"b16e7159c9d6254a30e19b529014e9d9930a702bf13c8c0364e78a839e55a397"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.028520 5116 generic.go:358] "Generic (PLEG): container finished" podID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerID="f6ccc3cf8e5e1fac21a450937d818f73d9c8ea21d213cc087495650a551817ba" exitCode=0
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.028627 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"f6ccc3cf8e5e1fac21a450937d818f73d9c8ea21d213cc087495650a551817ba"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.033622 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2") pod "9e9b5059-1b3e-4067-a63d-2952cbe863af" (UID: "9e9b5059-1b3e-4067-a63d-2952cbe863af"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue ""
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.033701 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"e6c5bea9360e0d6bf8238510407377ca2564dff36e62aa24d3ae073673381431"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.033752 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-5bb8f5cd97-xdvz5" event={"ID":"f863fff9-286a-45fa-b8f0-8a86994b8440","Type":"ContainerStarted","Data":"a0d97d493c5b4a9902109682c624deddf807b1f2688a8dd2bc5d8fbe7851a740"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.046207 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-11-crc" podStartSLOduration=3.046160594 podStartE2EDuration="3.046160594s" podCreationTimestamp="2026-03-22 00:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:56.032686147 +0000 UTC m=+127.054987520" watchObservedRunningTime="2026-03-22 00:10:56.046160594 +0000 UTC m=+127.068461967"
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.052038 5116 generic.go:358] "Generic (PLEG): container finished" podID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" exitCode=0
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.052330 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.052391 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerStarted","Data":"5618b991bbeac41e6785299c47466f86c4e18d2882dd3d7064face6121177c93"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.061640 5116 generic.go:358] "Generic (PLEG): container finished" podID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerID="cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923" exitCode=0
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.061745 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerDied","Data":"cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.068060 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"5aba16f642e38c56ebd13cbed0fe8ffe84084bc59fa041b240ad7c62484dc1fe"}
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.218056 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\""
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.227280 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.701731 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-7d4fc7d867-td5gr"
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.772388 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"]
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.934719 5116 patch_prober.go:28] interesting pod/router-default-68cf44c8b8-2jlxw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 22 00:10:56 crc kubenswrapper[5116]: [-]has-synced failed: reason withheld
Mar 22 00:10:56 crc kubenswrapper[5116]: [+]process-running ok
Mar 22 00:10:56 crc kubenswrapper[5116]: healthz check failed
Mar 22 00:10:56 crc kubenswrapper[5116]: I0322 00:10:56.934784 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw" podUID="00dffd10-d567-431f-8dd9-390443f26d96" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.080010 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" event={"ID":"1bb9e03f-ef85-4dbe-802f-d529e97b092c","Type":"ContainerStarted","Data":"28cf8da7c74cf08d2480113a46e665cb47e7c90c0ce62edd51dce536553dc5c7"}
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.081141 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerStarted","Data":"bca4de0caba9859b14c3f0eb17a3776e71425e24f9833420070510404cc3406c"}
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.082698 5116 generic.go:358] "Generic (PLEG): container finished" podID="087356d8-050c-4861-8799-76df7a8330cb" containerID="130739c8c4fc9560744543b32a2dedebe00df0b00c06be853ce09fafeeaac564" exitCode=0
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.082887 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerDied","Data":"130739c8c4fc9560744543b32a2dedebe00df0b00c06be853ce09fafeeaac564"}
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.123876 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mzck5" podStartSLOduration=20.123853971 podStartE2EDuration="20.123853971s" podCreationTimestamp="2026-03-22 00:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:57.104007611 +0000 UTC m=+128.126308984" watchObservedRunningTime="2026-03-22 00:10:57.123853971 +0000 UTC m=+128.146155344"
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.355478 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.372542 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") pod \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") "
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.372719 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") pod \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") "
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.372752 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") pod \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\" (UID: \"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1\") "
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.373664 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" (UID: "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.379827 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn" (OuterVolumeSpecName: "kube-api-access-clsgn") pod "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" (UID: "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1"). InnerVolumeSpecName "kube-api-access-clsgn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.380375 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" (UID: "113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.474822 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clsgn\" (UniqueName: \"kubernetes.io/projected/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-kube-api-access-clsgn\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.474864 5116 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.474877 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1-config-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.718031 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e9b5059-1b3e-4067-a63d-2952cbe863af" path="/var/lib/kubelet/pods/9e9b5059-1b3e-4067-a63d-2952cbe863af/volumes"
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.943114 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:57 crc kubenswrapper[5116]: I0322 00:10:57.948059 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68cf44c8b8-2jlxw"
Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.090734 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl" event={"ID":"113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1","Type":"ContainerDied","Data":"dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9"}
Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.090779 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dae1818a1082f52b271c6e67aa4b0cda2f80f759d34e5d7f0f380b84d59344e9"
Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.090883 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568960-fmcxl"
Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.095352 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerStarted","Data":"c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c"}
Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.096200 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp"
Mar 22 00:10:58 crc kubenswrapper[5116]: I0322 00:10:58.116193 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" podStartSLOduration=108.116110596 podStartE2EDuration="1m48.116110596s" podCreationTimestamp="2026-03-22 00:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:10:58.113567945 +0000 UTC m=+129.135869338" watchObservedRunningTime="2026-03-22 00:10:58.116110596 +0000 UTC m=+129.138411969"
Mar 22 00:10:59 crc kubenswrapper[5116]: I0322 00:10:59.556502 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Mar 22 00:10:59 crc kubenswrapper[5116]: I0322 00:10:59.556902 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.462502 5116 patch_prober.go:28] interesting pod/console-64d44f6ddf-9g5sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.462570 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d44f6ddf-9g5sg" podUID="4c2755ce-817d-47b0-9f19-7218641d0c5b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.839099 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:11:00 crc kubenswrapper[5116]: I0322 00:11:00.878075 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026254 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") pod \"087356d8-050c-4861-8799-76df7a8330cb\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") "
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026402 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "087356d8-050c-4861-8799-76df7a8330cb" (UID: "087356d8-050c-4861-8799-76df7a8330cb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026508 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") pod \"087356d8-050c-4861-8799-76df7a8330cb\" (UID: \"087356d8-050c-4861-8799-76df7a8330cb\") "
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.026766 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/087356d8-050c-4861-8799-76df7a8330cb-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.034111 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "087356d8-050c-4861-8799-76df7a8330cb" (UID: "087356d8-050c-4861-8799-76df7a8330cb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.120432 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-11-crc" event={"ID":"087356d8-050c-4861-8799-76df7a8330cb","Type":"ContainerDied","Data":"25a14810c25178d5701d5a77da67d4d6de2709a0d6bd4a338a91bc57a5e63c27"}
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.120480 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25a14810c25178d5701d5a77da67d4d6de2709a0d6bd4a338a91bc57a5e63c27"
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.120600 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-11-crc"
Mar 22 00:11:01 crc kubenswrapper[5116]: I0322 00:11:01.127414 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/087356d8-050c-4861-8799-76df7a8330cb-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:02 crc kubenswrapper[5116]: I0322 00:11:02.169136 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq"
Mar 22 00:11:02 crc kubenswrapper[5116]: I0322 00:11:02.497237 5116 patch_prober.go:28] interesting pod/downloads-747b44746d-cb5p2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Mar 22 00:11:02 crc kubenswrapper[5116]: I0322 00:11:02.497646 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-747b44746d-cb5p2" podUID="45fa64e1-27bb-4b1f-bf62-4fa08b5dcfa0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Mar 22 00:11:03 crc kubenswrapper[5116]: I0322 00:11:03.325792 5116 ???:1] "http: TLS handshake error from 192.168.126.11:52016: no serving certificate available for the kubelet"
Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.968322 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.970625 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.971683 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:11:04 crc kubenswrapper[5116]: E0322 00:11:04.971726 5116 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" probeResult="unknown"
Mar 22 00:11:10 crc kubenswrapper[5116]: I0322 00:11:10.480600 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:11:10 crc kubenswrapper[5116]: I0322 00:11:10.491966 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d44f6ddf-9g5sg"
Mar 22 00:11:12 crc kubenswrapper[5116]: I0322 00:11:12.513650 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-747b44746d-cb5p2"
Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.968329 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.969576 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.970891 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 22 00:11:14 crc kubenswrapper[5116]: E0322 00:11:14.970931 5116 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" probeResult="unknown"
Mar 22 00:11:15 crc kubenswrapper[5116]: W0322 00:11:15.735803 5116 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-conmon-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-conmon-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope: no such file or directory
Mar 22 00:11:15 crc kubenswrapper[5116]: W0322 00:11:15.736094 5116 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fbdb0d_3da3_4d36_9a96_4ed0caa53799.slice/crio-bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867.scope: no such file or directory
Mar 22 00:11:15 crc kubenswrapper[5116]: W0322 00:11:15.750901 5116 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-pod087356d8_050c_4861_8799_76df7a8330cb.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-pod087356d8_050c_4861_8799_76df7a8330cb.slice: no such file or directory
Mar 22 00:11:15 crc kubenswrapper[5116]: E0322 00:11:15.844334 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3b0eb3_e48f_4080_bfdc_522f18cf2876.slice/crio-48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice/crio-conmon-cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77380b82_4c44_4cfd_a7b1_e77b060af507.slice/crio-650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod06481163_cdc1_4b43_b6c2_73f2672feb42.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice/crio-cce1736020766020678b302f9485885375b60ee80f25f29a702ff1bc84c0c923.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d2a94f_b4d4_4cdc_b862_a4866cadaea1.slice/crio-a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77380b82_4c44_4cfd_a7b1_e77b060af507.slice/crio-conmon-650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod06481163_cdc1_4b43_b6c2_73f2672feb42.slice/crio-5fd306e69fede60242c9ecd2db58c2c9e041c89fdb800077d2c6965a559bbb92\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ffd3f_0faf_40f9_b1ab_0c7b88fc90f1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3b0eb3_e48f_4080_bfdc_522f18cf2876.slice/crio-conmon-48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7.scope\": RecentStats: unable to find data in memory cache]"
Mar 22 00:11:16 crc kubenswrapper[5116]: I0322
00:11:16.500615 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-57nbs_f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/kube-multus-additional-cni-plugins/0.log" Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 00:11:16.501441 5116 generic.go:358] "Generic (PLEG): container finished" podID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" exitCode=137 Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 00:11:16.501566 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerDied","Data":"a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e"} Mar 22 00:11:16 crc kubenswrapper[5116]: I0322 00:11:16.701107 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-77f986bd66-ms24k" Mar 22 00:11:20 crc kubenswrapper[5116]: I0322 00:11:20.111963 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.751034 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-57nbs_f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/kube-multus-additional-cni-plugins/0.log" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.751128 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767239 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767398 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767465 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767529 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") pod \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\" (UID: \"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1\") " Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.767752 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "tuning-conf-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.768197 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready" (OuterVolumeSpecName: "ready") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.768577 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.774986 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf" (OuterVolumeSpecName: "kube-api-access-7mkbf") pod "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" (UID: "f1d2a94f-b4d4-4cdc-b862-a4866cadaea1"). InnerVolumeSpecName "kube-api-access-7mkbf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.868986 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mkbf\" (UniqueName: \"kubernetes.io/projected/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-kube-api-access-7mkbf\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.869279 5116 reconciler_common.go:299] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.869292 5116 reconciler_common.go:299] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:21 crc kubenswrapper[5116]: I0322 00:11:21.869328 5116 reconciler_common.go:299] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1-ready\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.543096 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-57nbs_f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/kube-multus-additional-cni-plugins/0.log" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.543893 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.545387 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-57nbs" event={"ID":"f1d2a94f-b4d4-4cdc-b862-a4866cadaea1","Type":"ContainerDied","Data":"03b66e449ae75da9ca6d01567116f2bb706bdaf3b8c1863bf017db47b855716d"} Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.545440 5116 scope.go:117] "RemoveContainer" containerID="a68815d0fdf2b53a6a0e9f938fc585a599e79f4f7ad52f8b29c63ba519d0d23e" Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.552087 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerStarted","Data":"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"} Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.595438 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:11:22 crc kubenswrapper[5116]: I0322 00:11:22.602232 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-57nbs"] Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.559735 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerStarted","Data":"4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.562885 5116 generic.go:358] "Generic (PLEG): container finished" podID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.563000 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" 
event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.569605 5116 generic.go:358] "Generic (PLEG): container finished" podID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerID="3662e71bd41b60c7bbef1f51273ae388448fc2e3a846e9f692b29bbba4929dce" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.569733 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"3662e71bd41b60c7bbef1f51273ae388448fc2e3a846e9f692b29bbba4929dce"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.573515 5116 generic.go:358] "Generic (PLEG): container finished" podID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerID="bd585f6a2418bf617978e44c5ded778fb5ab883949c7c6d99346b0cce7aab8d6" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.573616 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerDied","Data":"bd585f6a2418bf617978e44c5ded778fb5ab883949c7c6d99346b0cce7aab8d6"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.590419 5116 generic.go:358] "Generic (PLEG): container finished" podID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.590621 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.592353 5116 generic.go:358] "Generic (PLEG): container 
finished" podID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.592432 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.596425 5116 generic.go:358] "Generic (PLEG): container finished" podID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.596795 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.606998 5116 generic.go:358] "Generic (PLEG): container finished" podID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerID="30c022eef87348aae4e9bdbc424e5f6c1baa0356ea65b1994f9191806ffd90dd" exitCode=0 Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.607247 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"30c022eef87348aae4e9bdbc424e5f6c1baa0356ea65b1994f9191806ffd90dd"} Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.721259 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" path="/var/lib/kubelet/pods/f1d2a94f-b4d4-4cdc-b862-a4866cadaea1/volumes" Mar 22 00:11:23 crc kubenswrapper[5116]: I0322 00:11:23.847328 5116 ???:1] "http: TLS handshake error from 192.168.126.11:38976: no 
serving certificate available for the kubelet" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.617416 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerStarted","Data":"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.621323 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerStarted","Data":"58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.624938 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerStarted","Data":"fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.627951 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerStarted","Data":"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.634540 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerStarted","Data":"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.637959 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerStarted","Data":"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07"} Mar 22 
00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.640606 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerStarted","Data":"c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.644412 5116 generic.go:358] "Generic (PLEG): container finished" podID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerID="4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6" exitCode=0 Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.644472 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6"} Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.668077 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-npbn6" podStartSLOduration=7.688369582 podStartE2EDuration="35.668052984s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:54.117970129 +0000 UTC m=+125.140271502" lastFinishedPulling="2026-03-22 00:11:22.097653511 +0000 UTC m=+153.119954904" observedRunningTime="2026-03-22 00:11:24.6647619 +0000 UTC m=+155.687063293" watchObservedRunningTime="2026-03-22 00:11:24.668052984 +0000 UTC m=+155.690354347" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.669673 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbgnq" podStartSLOduration=6.622493156 podStartE2EDuration="32.669646935s" podCreationTimestamp="2026-03-22 00:10:52 +0000 UTC" firstStartedPulling="2026-03-22 00:10:56.053998923 +0000 UTC m=+127.076300286" lastFinishedPulling="2026-03-22 00:11:22.101152692 +0000 UTC m=+153.123454065" 
observedRunningTime="2026-03-22 00:11:24.643470587 +0000 UTC m=+155.665771970" watchObservedRunningTime="2026-03-22 00:11:24.669646935 +0000 UTC m=+155.691948308" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.683311 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wlccm" podStartSLOduration=6.799833974 podStartE2EDuration="33.683286856s" podCreationTimestamp="2026-03-22 00:10:51 +0000 UTC" firstStartedPulling="2026-03-22 00:10:54.958520081 +0000 UTC m=+125.980821444" lastFinishedPulling="2026-03-22 00:11:21.841972953 +0000 UTC m=+152.864274326" observedRunningTime="2026-03-22 00:11:24.682190051 +0000 UTC m=+155.704491424" watchObservedRunningTime="2026-03-22 00:11:24.683286856 +0000 UTC m=+155.705588229" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.746910 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kp7rb" podStartSLOduration=7.665904741 podStartE2EDuration="33.746894955s" podCreationTimestamp="2026-03-22 00:10:51 +0000 UTC" firstStartedPulling="2026-03-22 00:10:56.017484124 +0000 UTC m=+127.039785497" lastFinishedPulling="2026-03-22 00:11:22.098474318 +0000 UTC m=+153.120775711" observedRunningTime="2026-03-22 00:11:24.744655045 +0000 UTC m=+155.766956438" watchObservedRunningTime="2026-03-22 00:11:24.746894955 +0000 UTC m=+155.769196328" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.747338 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrcmf" podStartSLOduration=6.3151667719999995 podStartE2EDuration="35.747331619s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:52.317468208 +0000 UTC m=+123.339769581" lastFinishedPulling="2026-03-22 00:11:21.749633045 +0000 UTC m=+152.771934428" observedRunningTime="2026-03-22 00:11:24.709951658 +0000 UTC m=+155.732253041" 
watchObservedRunningTime="2026-03-22 00:11:24.747331619 +0000 UTC m=+155.769632992" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.772388 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4x6l" podStartSLOduration=6.24798832 podStartE2EDuration="35.77236138s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:52.317670385 +0000 UTC m=+123.339971768" lastFinishedPulling="2026-03-22 00:11:21.842043415 +0000 UTC m=+152.864344828" observedRunningTime="2026-03-22 00:11:24.770697607 +0000 UTC m=+155.792998990" watchObservedRunningTime="2026-03-22 00:11:24.77236138 +0000 UTC m=+155.794662753" Mar 22 00:11:24 crc kubenswrapper[5116]: I0322 00:11:24.797217 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hppqv" podStartSLOduration=6.234293697 podStartE2EDuration="35.797196625s" podCreationTimestamp="2026-03-22 00:10:49 +0000 UTC" firstStartedPulling="2026-03-22 00:10:52.316553979 +0000 UTC m=+123.338855352" lastFinishedPulling="2026-03-22 00:11:21.879456907 +0000 UTC m=+152.901758280" observedRunningTime="2026-03-22 00:11:24.79450112 +0000 UTC m=+155.816802503" watchObservedRunningTime="2026-03-22 00:11:24.797196625 +0000 UTC m=+155.819498008" Mar 22 00:11:25 crc kubenswrapper[5116]: I0322 00:11:25.653623 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerStarted","Data":"a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9"} Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.641912 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wss9d" podStartSLOduration=8.53420339 podStartE2EDuration="34.641892965s" podCreationTimestamp="2026-03-22 00:10:52 +0000 UTC" 
firstStartedPulling="2026-03-22 00:10:56.029603999 +0000 UTC m=+127.051905372" lastFinishedPulling="2026-03-22 00:11:22.137293574 +0000 UTC m=+153.159594947" observedRunningTime="2026-03-22 00:11:25.682323074 +0000 UTC m=+156.704624447" watchObservedRunningTime="2026-03-22 00:11:26.641892965 +0000 UTC m=+157.664194338" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.643571 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644223 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644244 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644262 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087356d8-050c-4861-8799-76df7a8330cb" containerName="pruner" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644268 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="087356d8-050c-4861-8799-76df7a8330cb" containerName="pruner" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644280 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerName="collect-profiles" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644287 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerName="collect-profiles" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644384 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="087356d8-050c-4861-8799-76df7a8330cb" containerName="pruner" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 
00:11:26.644395 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1d2a94f-b4d4-4cdc-b862-a4866cadaea1" containerName="kube-multus-additional-cni-plugins" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.644408 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="113ffd3f-0faf-40f9-b1ab-0c7b88fc90f1" containerName="collect-profiles" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.652471 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.654232 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\"" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.654279 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\"" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.654357 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"] Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.729132 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.729234 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc 
kubenswrapper[5116]: I0322 00:11:26.830252 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.830542 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.830996 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.859067 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"revision-pruner-12-crc\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") " pod="openshift-kube-apiserver/revision-pruner-12-crc" Mar 22 00:11:26 crc kubenswrapper[5116]: I0322 00:11:26.971769 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 22 00:11:27 crc kubenswrapper[5116]: I0322 00:11:27.087725 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fhkjl"
Mar 22 00:11:27 crc kubenswrapper[5116]: I0322 00:11:27.381233 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-12-crc"]
Mar 22 00:11:27 crc kubenswrapper[5116]: W0322 00:11:27.387414 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf36eaa2d_3ae4_4e89_991a_f7f42d317944.slice/crio-9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab WatchSource:0}: Error finding container 9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab: Status 404 returned error can't find the container with id 9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab
Mar 22 00:11:27 crc kubenswrapper[5116]: I0322 00:11:27.667825 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"f36eaa2d-3ae4-4e89-991a-f7f42d317944","Type":"ContainerStarted","Data":"9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab"}
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.696360 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.707708 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.816599 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.817012 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.959906 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:29 crc kubenswrapper[5116]: I0322 00:11:29.963298 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.160077 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.160534 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.203018 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.278697 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.278733 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.329689 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.689178 5116 generic.go:358] "Generic (PLEG): container finished" podID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerID="661758079b22658bb8c2b551471b833a0b0446a747ae3daba963976c194cf27a" exitCode=0
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.689256 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"f36eaa2d-3ae4-4e89-991a-f7f42d317944","Type":"ContainerDied","Data":"661758079b22658bb8c2b551471b833a0b0446a747ae3daba963976c194cf27a"}
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.736368 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.736962 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.745510 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:11:30 crc kubenswrapper[5116]: I0322 00:11:30.745729 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:31 crc kubenswrapper[5116]: I0322 00:11:31.909537 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002035 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") pod \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") "
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002160 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") pod \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\" (UID: \"f36eaa2d-3ae4-4e89-991a-f7f42d317944\") "
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002153 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f36eaa2d-3ae4-4e89-991a-f7f42d317944" (UID: "f36eaa2d-3ae4-4e89-991a-f7f42d317944"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.002419 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.010127 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f36eaa2d-3ae4-4e89-991a-f7f42d317944" (UID: "f36eaa2d-3ae4-4e89-991a-f7f42d317944"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.076774 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.103243 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f36eaa2d-3ae4-4e89-991a-f7f42d317944-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.274115 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.687601 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.687669 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703526 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-12-crc"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703559 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-12-crc" event={"ID":"f36eaa2d-3ae4-4e89-991a-f7f42d317944","Type":"ContainerDied","Data":"9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab"}
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703599 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d0baae26ab41b01ca788cfd1236f141d1054c48d039a7574e2a9713b5c765ab"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.703961 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hppqv" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server" containerID="cri-o://85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f" gracePeriod=2
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.757935 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.790387 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.790669 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.812983 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.891565 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.891626 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.914288 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:32 crc kubenswrapper[5116]: I0322 00:11:32.930977 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.575210 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.626897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") pod \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") "
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.627009 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") pod \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") "
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.627074 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") pod \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\" (UID: \"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70\") "
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.629332 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities" (OuterVolumeSpecName: "utilities") pod "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" (UID: "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.637645 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx" (OuterVolumeSpecName: "kube-api-access-c4dfx") pod "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" (UID: "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70"). InnerVolumeSpecName "kube-api-access-c4dfx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.669950 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" (UID: "1c6b741d-fb0d-4bb1-a050-a2e56bd11e70"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711405 5116 generic.go:358] "Generic (PLEG): container finished" podID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f" exitCode=0
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711489 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"}
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711543 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hppqv" event={"ID":"1c6b741d-fb0d-4bb1-a050-a2e56bd11e70","Type":"ContainerDied","Data":"58966d6ce1821ab01be1615acae0ea7c0ed73c7caaf2ca09de37bd0dfaf8db3c"}
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711544 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hppqv"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.711570 5116 scope.go:117] "RemoveContainer" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.712099 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-npbn6" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server" containerID="cri-o://d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354" gracePeriod=2
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.728716 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.728781 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.728797 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c4dfx\" (UniqueName: \"kubernetes.io/projected/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70-kube-api-access-c4dfx\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.735921 5116 scope.go:117] "RemoveContainer" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.754410 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.758899 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.759255 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hppqv"]
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.766947 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wlccm"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.770855 5116 scope.go:117] "RemoveContainer" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.823734 5116 scope.go:117] "RemoveContainer" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"
Mar 22 00:11:33 crc kubenswrapper[5116]: E0322 00:11:33.824232 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f\": container with ID starting with 85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f not found: ID does not exist" containerID="85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824267 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f"} err="failed to get container status \"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f\": rpc error: code = NotFound desc = could not find container \"85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f\": container with ID starting with 85225f683a8985a92bc5f6a4df8ef21ff41480ec9e3be2c91f882ab5aa940b5f not found: ID does not exist"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824312 5116 scope.go:117] "RemoveContainer" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"
Mar 22 00:11:33 crc kubenswrapper[5116]: E0322 00:11:33.824733 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79\": container with ID starting with 96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79 not found: ID does not exist" containerID="96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824764 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79"} err="failed to get container status \"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79\": rpc error: code = NotFound desc = could not find container \"96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79\": container with ID starting with 96e41905d769f207cae9700db4a07b0da2c414c390460451815b728a092aab79 not found: ID does not exist"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.824780 5116 scope.go:117] "RemoveContainer" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"
Mar 22 00:11:33 crc kubenswrapper[5116]: E0322 00:11:33.825058 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791\": container with ID starting with 4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791 not found: ID does not exist" containerID="4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"
Mar 22 00:11:33 crc kubenswrapper[5116]: I0322 00:11:33.825088 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791"} err="failed to get container status \"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791\": rpc error: code = NotFound desc = could not find container \"4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791\": container with ID starting with 4ef7e2e5385256cc0a82e70e80e8f709c236aa9f928adb0c844bf2af996e6791 not found: ID does not exist"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.034681 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.133134 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") pod \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") "
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.134042 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") pod \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") "
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.134094 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") pod \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\" (UID: \"09fbdb0d-3da3-4d36-9a96-4ed0caa53799\") "
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.135764 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities" (OuterVolumeSpecName: "utilities") pod "09fbdb0d-3da3-4d36-9a96-4ed0caa53799" (UID: "09fbdb0d-3da3-4d36-9a96-4ed0caa53799"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.139901 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc" (OuterVolumeSpecName: "kube-api-access-dbjjc") pod "09fbdb0d-3da3-4d36-9a96-4ed0caa53799" (UID: "09fbdb0d-3da3-4d36-9a96-4ed0caa53799"). InnerVolumeSpecName "kube-api-access-dbjjc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.218948 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09fbdb0d-3da3-4d36-9a96-4ed0caa53799" (UID: "09fbdb0d-3da3-4d36-9a96-4ed0caa53799"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.236600 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dbjjc\" (UniqueName: \"kubernetes.io/projected/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-kube-api-access-dbjjc\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.236669 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.236682 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09fbdb0d-3da3-4d36-9a96-4ed0caa53799-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.441473 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.441977 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.496501 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722767 5116 generic.go:358] "Generic (PLEG): container finished" podID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354" exitCode=0
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722846 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"}
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722906 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npbn6"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722951 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npbn6" event={"ID":"09fbdb0d-3da3-4d36-9a96-4ed0caa53799","Type":"ContainerDied","Data":"45e4dcc16acc5b874128c4be6959ee16a39761f0d8b86acfc6b789f6f799c066"}
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.722985 5116 scope.go:117] "RemoveContainer" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.745114 5116 scope.go:117] "RemoveContainer" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.757654 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.762466 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-npbn6"]
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.779936 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbgnq"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.794190 5116 scope.go:117] "RemoveContainer" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.812988 5116 scope.go:117] "RemoveContainer" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"
Mar 22 00:11:34 crc kubenswrapper[5116]: E0322 00:11:34.814562 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354\": container with ID starting with d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354 not found: ID does not exist" containerID="d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.814605 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354"} err="failed to get container status \"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354\": rpc error: code = NotFound desc = could not find container \"d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354\": container with ID starting with d02b073538fa9eb3a597c60e947eebf2c3d5fa91c1caac83fb6bdc09fab1e354 not found: ID does not exist"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.814630 5116 scope.go:117] "RemoveContainer" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"
Mar 22 00:11:34 crc kubenswrapper[5116]: E0322 00:11:34.815652 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299\": container with ID starting with 1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299 not found: ID does not exist" containerID="1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.815706 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299"} err="failed to get container status \"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299\": rpc error: code = NotFound desc = could not find container \"1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299\": container with ID starting with 1bec1f695abda26181894012664861851610963e95a85fd4d9deba481ae99299 not found: ID does not exist"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.815739 5116 scope.go:117] "RemoveContainer" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"
Mar 22 00:11:34 crc kubenswrapper[5116]: E0322 00:11:34.816096 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867\": container with ID starting with bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867 not found: ID does not exist" containerID="bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"
Mar 22 00:11:34 crc kubenswrapper[5116]: I0322 00:11:34.816123 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867"} err="failed to get container status \"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867\": rpc error: code = NotFound desc = could not find container \"bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867\": container with ID starting with bbda729440934768bb562f9328428c91354391de8fb49c00d7a6e4a343b96867 not found: ID does not exist"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.439724 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440331 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440344 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440359 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerName="pruner"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440364 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerName="pruner"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440373 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440380 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440393 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440399 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-utilities"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440408 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440413 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440420 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440425 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440439 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440444 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="extract-content"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440549 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440562 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" containerName="registry-server"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.440573 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f36eaa2d-3ae4-4e89-991a-f7f42d317944" containerName="pruner"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.446042 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.454727 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.459593 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver\"/\"installer-sa-dockercfg-bqqnb\""
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.459947 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"]
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.553946 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.554004 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.554024 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655416 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655489 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655530 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655566 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.655654 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " pod="openshift-kube-apiserver/installer-12-crc"
Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.680740 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"installer-12-crc\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") "
pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.704915 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09fbdb0d-3da3-4d36-9a96-4ed0caa53799" path="/var/lib/kubelet/pods/09fbdb0d-3da3-4d36-9a96-4ed0caa53799/volumes" Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.705634 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6b741d-fb0d-4bb1-a050-a2e56bd11e70" path="/var/lib/kubelet/pods/1c6b741d-fb0d-4bb1-a050-a2e56bd11e70/volumes" Mar 22 00:11:35 crc kubenswrapper[5116]: I0322 00:11:35.763413 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.197626 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-12-crc"] Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.471162 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.670135 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.740539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerStarted","Data":"ee145f68231146608459069e1444e4a54019fcaddb68827cf1a4f7b9bb7e9250"} Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.740584 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerStarted","Data":"6fa767d4004b7071ab450a3ff465069b85b54dadf7bad861245cb4344509496d"} Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.740977 5116 kuberuntime_container.go:858] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-marketplace-wlccm" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" containerID="cri-o://27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" gracePeriod=2 Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.741053 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbgnq" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" containerID="cri-o://e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" gracePeriod=2 Mar 22 00:11:36 crc kubenswrapper[5116]: I0322 00:11:36.770577 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-12-crc" podStartSLOduration=1.7705591410000001 podStartE2EDuration="1.770559141s" podCreationTimestamp="2026-03-22 00:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:11:36.76860125 +0000 UTC m=+167.790902643" watchObservedRunningTime="2026-03-22 00:11:36.770559141 +0000 UTC m=+167.792860514" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.174003 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.184017 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.275958 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") pod \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276116 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") pod \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276212 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") pod \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276281 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") pod \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276313 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") pod \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\" (UID: \"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.276360 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") pod \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\" (UID: \"29adb7c6-6fa5-4af7-9007-dc22cf4598e7\") " Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.277314 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities" (OuterVolumeSpecName: "utilities") pod "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" (UID: "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.277371 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities" (OuterVolumeSpecName: "utilities") pod "29adb7c6-6fa5-4af7-9007-dc22cf4598e7" (UID: "29adb7c6-6fa5-4af7-9007-dc22cf4598e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.285128 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs" (OuterVolumeSpecName: "kube-api-access-rvgbs") pod "29adb7c6-6fa5-4af7-9007-dc22cf4598e7" (UID: "29adb7c6-6fa5-4af7-9007-dc22cf4598e7"). InnerVolumeSpecName "kube-api-access-rvgbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.285288 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt" (OuterVolumeSpecName: "kube-api-access-cksgt") pod "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" (UID: "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e"). InnerVolumeSpecName "kube-api-access-cksgt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.297946 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" (UID: "77aaaf4b-1fdf-4c49-9e45-86aabb6f007e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.377941 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.377985 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvgbs\" (UniqueName: \"kubernetes.io/projected/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-kube-api-access-rvgbs\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.377999 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.378012 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.378023 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cksgt\" (UniqueName: \"kubernetes.io/projected/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e-kube-api-access-cksgt\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.410072 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29adb7c6-6fa5-4af7-9007-dc22cf4598e7" (UID: "29adb7c6-6fa5-4af7-9007-dc22cf4598e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.479350 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29adb7c6-6fa5-4af7-9007-dc22cf4598e7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.749828 5116 generic.go:358] "Generic (PLEG): container finished" podID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" exitCode=0 Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.749938 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.750150 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wlccm" event={"ID":"77aaaf4b-1fdf-4c49-9e45-86aabb6f007e","Type":"ContainerDied","Data":"96be2eef25c44ecc457a8a1fa10ad35be205dda004792bd0cccdb43de654bdc4"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.749987 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wlccm" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.750189 5116 scope.go:117] "RemoveContainer" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.761430 5116 generic.go:358] "Generic (PLEG): container finished" podID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" exitCode=0 Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.762384 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbgnq" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.762538 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.762573 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbgnq" event={"ID":"29adb7c6-6fa5-4af7-9007-dc22cf4598e7","Type":"ContainerDied","Data":"5618b991bbeac41e6785299c47466f86c4e18d2882dd3d7064face6121177c93"} Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.780752 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.781273 5116 scope.go:117] "RemoveContainer" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.788276 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wlccm"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.798669 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.799985 5116 scope.go:117] "RemoveContainer" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.803018 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbgnq"] Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.817650 5116 scope.go:117] "RemoveContainer" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.818152 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07\": container with ID starting with 27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07 not found: ID does not exist" containerID="27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818211 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07"} err="failed to get container status \"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07\": rpc error: code = NotFound desc = could not find container \"27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07\": container with ID starting with 27a93aeb60dad7dc0e6f4ee76954f9c95ea776bfea40c18f6141dcbc7dbb9e07 not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818239 5116 scope.go:117] "RemoveContainer" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.818550 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a\": container with ID starting with 9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a not found: ID does not exist" containerID="9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818568 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a"} err="failed to get container status \"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a\": rpc error: code = NotFound desc = could not find container \"9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a\": container with ID starting with 9a57353f19b072378c022259ab5cd42c5b1dc1b97fbc286360c406f3e67d341a not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818579 5116 scope.go:117] "RemoveContainer" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.818874 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5\": container with ID starting with 8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5 not found: ID does not exist" containerID="8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818932 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5"} err="failed to get container status \"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5\": rpc error: code = NotFound desc = could not find container \"8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5\": container 
with ID starting with 8c0f75cea0a92631a5a45b839d9a2769abea056ad176859d98334cc88e86bfd5 not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.818967 5116 scope.go:117] "RemoveContainer" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.833992 5116 scope.go:117] "RemoveContainer" containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.849467 5116 scope.go:117] "RemoveContainer" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.884930 5116 scope.go:117] "RemoveContainer" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.886202 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5\": container with ID starting with e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5 not found: ID does not exist" containerID="e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.886330 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5"} err="failed to get container status \"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5\": rpc error: code = NotFound desc = could not find container \"e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5\": container with ID starting with e92ff2f3629ac260811a785d8c6f8eb5ed939e4fb38c14eb7bff634a715c68a5 not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.886423 5116 scope.go:117] "RemoveContainer" 
containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.888608 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e\": container with ID starting with b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e not found: ID does not exist" containerID="b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.888670 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e"} err="failed to get container status \"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e\": rpc error: code = NotFound desc = could not find container \"b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e\": container with ID starting with b89e7acd7f87121f1cf68c7619f0385299e7a83f180ceb344d324886b920069e not found: ID does not exist" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.888710 5116 scope.go:117] "RemoveContainer" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" Mar 22 00:11:37 crc kubenswrapper[5116]: E0322 00:11:37.889080 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe\": container with ID starting with 95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe not found: ID does not exist" containerID="95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe" Mar 22 00:11:37 crc kubenswrapper[5116]: I0322 00:11:37.889108 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe"} err="failed to get container status \"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe\": rpc error: code = NotFound desc = could not find container \"95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe\": container with ID starting with 95e390561446d32b93dd2b4ee1d879e784f5ecc936c45e238f849f636fbfb5fe not found: ID does not exist" Mar 22 00:11:39 crc kubenswrapper[5116]: I0322 00:11:39.706469 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" path="/var/lib/kubelet/pods/29adb7c6-6fa5-4af7-9007-dc22cf4598e7/volumes" Mar 22 00:11:39 crc kubenswrapper[5116]: I0322 00:11:39.709396 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" path="/var/lib/kubelet/pods/77aaaf4b-1fdf-4c49-9e45-86aabb6f007e/volumes" Mar 22 00:11:59 crc kubenswrapper[5116]: I0322 00:11:59.615122 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-66458b6674-8qfhd"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.134419 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.136949 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137006 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137069 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 
00:12:00.137087 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-utilities" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137125 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137142 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137224 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137243 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="extract-content" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137260 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137276 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137299 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137318 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137558 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="29adb7c6-6fa5-4af7-9007-dc22cf4598e7" containerName="registry-server" Mar 
22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.137585 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="77aaaf4b-1fdf-4c49-9e45-86aabb6f007e" containerName="registry-server" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.156504 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.156657 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.162765 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.162858 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.162772 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.268898 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"auto-csr-approver-29568972-5s86m\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.370539 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"auto-csr-approver-29568972-5s86m\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " pod="openshift-infra/auto-csr-approver-29568972-5s86m" 
Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.408464 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"auto-csr-approver-29568972-5s86m\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.476005 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.851252 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:12:00 crc kubenswrapper[5116]: I0322 00:12:00.894100 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568972-5s86m" event={"ID":"907ec022-a4e4-4d33-8329-52c9bbb71520","Type":"ContainerStarted","Data":"de25deff2abb4d7c9ae4cfaa6d6a15c6e45609e7a4b4386bae474e620d2b3211"} Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.330727 5116 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-h678z" Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.339132 5116 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-h678z" Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.914641 5116 generic.go:358] "Generic (PLEG): container finished" podID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerID="5f050f299176f9b417ea910e3bb8affec9c6d4bf35a6de76d0aa5ed0d88ddf0f" exitCode=0 Mar 22 00:12:04 crc kubenswrapper[5116]: I0322 00:12:04.914686 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568972-5s86m" 
event={"ID":"907ec022-a4e4-4d33-8329-52c9bbb71520","Type":"ContainerDied","Data":"5f050f299176f9b417ea910e3bb8affec9c6d4bf35a6de76d0aa5ed0d88ddf0f"} Mar 22 00:12:05 crc kubenswrapper[5116]: I0322 00:12:05.340299 5116 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2026-04-21 00:07:04 +0000 UTC" deadline="2026-04-16 04:03:48.043582248 +0000 UTC" Mar 22 00:12:05 crc kubenswrapper[5116]: I0322 00:12:05.340343 5116 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="603h51m42.703243438s" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.213087 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.291702 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") pod \"907ec022-a4e4-4d33-8329-52c9bbb71520\" (UID: \"907ec022-a4e4-4d33-8329-52c9bbb71520\") " Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.298522 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc" (OuterVolumeSpecName: "kube-api-access-jvfbc") pod "907ec022-a4e4-4d33-8329-52c9bbb71520" (UID: "907ec022-a4e4-4d33-8329-52c9bbb71520"). InnerVolumeSpecName "kube-api-access-jvfbc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.340660 5116 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2026-04-21 00:07:04 +0000 UTC" deadline="2026-04-16 02:01:17.200055935 +0000 UTC" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.341044 5116 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="601h49m10.859016373s" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.393285 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvfbc\" (UniqueName: \"kubernetes.io/projected/907ec022-a4e4-4d33-8329-52c9bbb71520-kube-api-access-jvfbc\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.928223 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568972-5s86m" event={"ID":"907ec022-a4e4-4d33-8329-52c9bbb71520","Type":"ContainerDied","Data":"de25deff2abb4d7c9ae4cfaa6d6a15c6e45609e7a4b4386bae474e620d2b3211"} Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.928283 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de25deff2abb4d7c9ae4cfaa6d6a15c6e45609e7a4b4386bae474e620d2b3211" Mar 22 00:12:06 crc kubenswrapper[5116]: I0322 00:12:06.928282 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568972-5s86m" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.320885 5116 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.322895 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerName="oc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.322926 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerName="oc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.323159 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" containerName="oc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.327896 5116 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.327948 5116 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.328194 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329195 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" containerID="cri-o://ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329248 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329370 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329191 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" gracePeriod=15 Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329210 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" gracePeriod=15 Mar 22 00:12:14 crc 
kubenswrapper[5116]: I0322 00:12:14.329609 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329649 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329679 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329696 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329776 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329809 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329833 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329848 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329869 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc 
kubenswrapper[5116]: I0322 00:12:14.329885 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329912 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329929 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329959 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329974 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="setup" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.329992 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330007 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330033 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330047 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330304 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" 
containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330333 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330351 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-regeneration-controller" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330376 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330397 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330416 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-insecure-readyz" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330433 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-cert-syncer" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330701 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330724 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.330985 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc 
kubenswrapper[5116]: I0322 00:12:14.331496 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a14caf222afb62aaabdc47808b6f944" containerName="kube-apiserver-check-endpoints" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.336850 5116 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="3a14caf222afb62aaabdc47808b6f944" podUID="57755cc5f99000cc11e193051474d4e2" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.350431 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.369084 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406191 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406257 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406414 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406464 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406528 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406590 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406658 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406681 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406699 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.406713 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.507954 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508010 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508041 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508078 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508153 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508155 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508199 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508155 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508237 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508288 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508282 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508329 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508351 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 
22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508355 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508449 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508501 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508559 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/57755cc5f99000cc11e193051474d4e2-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508590 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508771 5116 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-tmp-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.508916 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/57755cc5f99000cc11e193051474d4e2-ca-bundle-dir\") pod \"kube-apiserver-crc\" (UID: \"57755cc5f99000cc11e193051474d4e2\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: I0322 00:12:14.666370 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:12:14 crc kubenswrapper[5116]: E0322 00:12:14.693977 5116 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189f016a65c5b829 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,LastTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.016849 5116 generic.go:358] "Generic (PLEG): container finished" podID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerID="1d4ffcdcf7f3c1ceaea89d564eaa99a17f05170fe4b5407ce1655d672a0e5a4e" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.016953 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerDied","Data":"1d4ffcdcf7f3c1ceaea89d564eaa99a17f05170fe4b5407ce1655d672a0e5a4e"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.018028 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.018328 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.019569 5116 generic.go:358] "Generic (PLEG): container finished" podID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerID="ee145f68231146608459069e1444e4a54019fcaddb68827cf1a4f7b9bb7e9250" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.019663 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" 
event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerDied","Data":"ee145f68231146608459069e1444e4a54019fcaddb68827cf1a4f7b9bb7e9250"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.020052 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.020277 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.020542 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.022802 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-check-endpoints/3.log" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.024266 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025260 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="3a14caf222afb62aaabdc47808b6f944" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025290 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025299 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" exitCode=0 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025307 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" exitCode=2 Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.025357 5116 scope.go:117] "RemoveContainer" containerID="4ec1f0e4053fa1e136a94ad86e588cc0fd43b29333734b120fe3d6175c1913a8" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027010 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027034 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f7dbc7e1ee9c187a863ef9b473fad27b","Type":"ContainerStarted","Data":"e89942e237048983deb4673cd81f4ae670d784cef4073c8536922b7c8b0579fa"} Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027600 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.027997 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:15 crc kubenswrapper[5116]: I0322 00:12:15.028347 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.039290 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.359947 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.360951 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.361257 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.361510 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.365598 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.366065 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.366329 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.366704 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.434807 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") pod \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.434900 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") pod \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " Mar 
22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.434986 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") pod \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.435098 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") pod \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\" (UID: \"d04f6d8c-7814-4e1f-8000-afd2938eb5db\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.435215 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") pod \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\" (UID: \"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.438439 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock" (OuterVolumeSpecName: "var-lock") pod "d04f6d8c-7814-4e1f-8000-afd2938eb5db" (UID: "d04f6d8c-7814-4e1f-8000-afd2938eb5db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.438550 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d04f6d8c-7814-4e1f-8000-afd2938eb5db" (UID: "d04f6d8c-7814-4e1f-8000-afd2938eb5db"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.439248 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca" (OuterVolumeSpecName: "serviceca") pod "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" (UID: "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.442634 5116 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-var-lock\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.442699 5116 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-serviceca\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.442728 5116 reconciler_common.go:299] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.460440 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d04f6d8c-7814-4e1f-8000-afd2938eb5db" (UID: "d04f6d8c-7814-4e1f-8000-afd2938eb5db"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.462369 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd" (OuterVolumeSpecName: "kube-api-access-b9lrd") pod "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" (UID: "f8b3ade2-2521-43a2-a5fc-2c33d19f3a58"). InnerVolumeSpecName "kube-api-access-b9lrd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.543502 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9lrd\" (UniqueName: \"kubernetes.io/projected/f8b3ade2-2521-43a2-a5fc-2c33d19f3a58-kube-api-access-b9lrd\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.543543 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04f6d8c-7814-4e1f-8000-afd2938eb5db-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.727293 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.727879 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.728424 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.728645 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.728920 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.729330 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847298 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") 
" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847405 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847470 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847521 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847559 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") pod \"3a14caf222afb62aaabdc47808b6f944\" (UID: \"3a14caf222afb62aaabdc47808b6f944\") " Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847619 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847738 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847950 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir" (OuterVolumeSpecName: "ca-bundle-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "ca-bundle-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.847600 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848133 5116 reconciler_common.go:299] "Volume detached for volume \"ca-bundle-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-ca-bundle-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848146 5116 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848155 5116 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.848183 5116 reconciler_common.go:299] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3a14caf222afb62aaabdc47808b6f944-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.850634 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3a14caf222afb62aaabdc47808b6f944" (UID: "3a14caf222afb62aaabdc47808b6f944"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:12:16 crc kubenswrapper[5116]: I0322 00:12:16.949382 5116 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a14caf222afb62aaabdc47808b6f944-tmp-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.047822 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29568960-tjk88" event={"ID":"f8b3ade2-2521-43a2-a5fc-2c33d19f3a58","Type":"ContainerDied","Data":"82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c"} Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.047871 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82ad6b215452bb6301a7576bc4e5f0e0d906844913073e724aa03d477079324c" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.047988 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29568960-tjk88" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.051580 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-12-crc" event={"ID":"d04f6d8c-7814-4e1f-8000-afd2938eb5db","Type":"ContainerDied","Data":"6fa767d4004b7071ab450a3ff465069b85b54dadf7bad861245cb4344509496d"} Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.051643 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa767d4004b7071ab450a3ff465069b85b54dadf7bad861245cb4344509496d" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.051649 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-12-crc" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.054141 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_3a14caf222afb62aaabdc47808b6f944/kube-apiserver-cert-syncer/0.log" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.055017 5116 generic.go:358] "Generic (PLEG): container finished" podID="3a14caf222afb62aaabdc47808b6f944" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" exitCode=0 Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.055118 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.055124 5116 scope.go:117] "RemoveContainer" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.064747 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.065144 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.065594 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.065869 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.068869 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.069762 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.070073 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.070391 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.070854 5116 scope.go:117] "RemoveContainer" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.072338 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.072661 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.073024 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.073311 5116 status_manager.go:895] "Failed to get status for pod" podUID="3a14caf222afb62aaabdc47808b6f944" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 
00:12:17.089354 5116 scope.go:117] "RemoveContainer" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.108961 5116 scope.go:117] "RemoveContainer" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.123533 5116 scope.go:117] "RemoveContainer" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.145086 5116 scope.go:117] "RemoveContainer" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.206928 5116 scope.go:117] "RemoveContainer" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.207831 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f\": container with ID starting with dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f not found: ID does not exist" containerID="dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.207875 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f"} err="failed to get container status \"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f\": rpc error: code = NotFound desc = could not find container \"dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f\": container with ID starting with dcfc5090822842b425def56d8dcf3225bb8000eb1ec79e8329dcadd2f3879a0f not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.207901 5116 scope.go:117] 
"RemoveContainer" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.208243 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\": container with ID starting with b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456 not found: ID does not exist" containerID="b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208266 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456"} err="failed to get container status \"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\": rpc error: code = NotFound desc = could not find container \"b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456\": container with ID starting with b24be2af5c7a78bb1d324b802332ff3620ee459e00164b6574221c5186689456 not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208283 5116 scope.go:117] "RemoveContainer" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.208791 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\": container with ID starting with 8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb not found: ID does not exist" containerID="8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208838 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb"} err="failed to get container status \"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\": rpc error: code = NotFound desc = could not find container \"8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb\": container with ID starting with 8b5eb8790cbf9c748b6973a9f5ce75637e41f035ed3bdd5eda970498f8d57bdb not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.208867 5116 scope.go:117] "RemoveContainer" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.209120 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\": container with ID starting with d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b not found: ID does not exist" containerID="d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209160 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b"} err="failed to get container status \"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\": rpc error: code = NotFound desc = could not find container \"d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b\": container with ID starting with d4b3378494f7debf7a1df1eb70ba9f5b687fd897b206a12e1eb127da23e5830b not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209208 5116 scope.go:117] "RemoveContainer" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.209494 5116 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\": container with ID starting with ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7 not found: ID does not exist" containerID="ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209520 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7"} err="failed to get container status \"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\": rpc error: code = NotFound desc = could not find container \"ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7\": container with ID starting with ad6827aa53ec071d573f2851cceb4fb83950ed3acf710fdb8da0db4f5143d5e7 not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209536 5116 scope.go:117] "RemoveContainer" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" Mar 22 00:12:17 crc kubenswrapper[5116]: E0322 00:12:17.209794 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\": container with ID starting with 15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5 not found: ID does not exist" containerID="15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.209874 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5"} err="failed to get container status \"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\": rpc error: code = NotFound desc = could not find container 
\"15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5\": container with ID starting with 15a3fa2ea4a685791c1b819448d6d0952ea97b7caf1a51b4250746e17e743cc5 not found: ID does not exist" Mar 22 00:12:17 crc kubenswrapper[5116]: I0322 00:12:17.707402 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a14caf222afb62aaabdc47808b6f944" path="/var/lib/kubelet/pods/3a14caf222afb62aaabdc47808b6f944/volumes" Mar 22 00:12:19 crc kubenswrapper[5116]: I0322 00:12:19.703950 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:19 crc kubenswrapper[5116]: I0322 00:12:19.705813 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:19 crc kubenswrapper[5116]: I0322 00:12:19.706574 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:21 crc kubenswrapper[5116]: E0322 00:12:21.762520 5116 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189f016a65c5b829 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f7dbc7e1ee9c187a863ef9b473fad27b,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,LastTimestamp:2026-03-22 00:12:14.692882473 +0000 UTC m=+205.715183896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 22 00:12:23 crc kubenswrapper[5116]: I0322 00:12:23.056780 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:12:23 crc kubenswrapper[5116]: I0322 00:12:23.056855 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.497823 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:23 crc 
kubenswrapper[5116]: E0322 00:12:23.498261 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.498750 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.499095 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.499475 5116 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:23 crc kubenswrapper[5116]: I0322 00:12:23.499512 5116 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.499809 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Mar 22 00:12:23 crc kubenswrapper[5116]: E0322 00:12:23.700503 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection 
refused" interval="400ms" Mar 22 00:12:24 crc kubenswrapper[5116]: E0322 00:12:24.101534 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 22 00:12:24 crc kubenswrapper[5116]: I0322 00:12:24.644396 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" containerID="cri-o://0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5" gracePeriod=15 Mar 22 00:12:24 crc kubenswrapper[5116]: E0322 00:12:24.903477 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.107745 5116 generic.go:358] "Generic (PLEG): container finished" podID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerID="0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5" exitCode=0 Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.107860 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerDied","Data":"0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5"} Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.108364 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" event={"ID":"73ebea9b-fc7b-4d54-af53-f6f61e0fce97","Type":"ContainerDied","Data":"541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb"} Mar 22 
00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.108386 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="541bc11e49e0b922c8da02fa30a7c9193a344bc13bc63367f3137d8c058690fb" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.129397 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.129960 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.130563 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.130827 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.131105 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial 
tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.256971 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257063 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257115 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257197 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257231 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 
00:12:25.257312 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257357 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257392 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257457 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257542 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257584 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257627 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257667 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.257717 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") pod \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\" (UID: \"73ebea9b-fc7b-4d54-af53-f6f61e0fce97\") " Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.258361 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.258807 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.259075 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.259311 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.259609 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.264765 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.265202 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.265520 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.268514 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht" (OuterVolumeSpecName: "kube-api-access-qljht") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "kube-api-access-qljht". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.268596 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.268767 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.269183 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.269599 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.270901 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "73ebea9b-fc7b-4d54-af53-f6f61e0fce97" (UID: "73ebea9b-fc7b-4d54-af53-f6f61e0fce97"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.359846 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.359969 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360032 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qljht\" (UniqueName: \"kubernetes.io/projected/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-kube-api-access-qljht\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360052 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360070 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360089 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360106 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360129 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360158 5116 reconciler_common.go:299] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360216 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360299 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360330 5116 
reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360355 5116 reconciler_common.go:299] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:25 crc kubenswrapper[5116]: I0322 00:12:25.360373 5116 reconciler_common.go:299] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/73ebea9b-fc7b-4d54-af53-f6f61e0fce97-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.116631 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.118042 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.118632 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.119115 5116 status_manager.go:895] "Failed to get status for pod" 
podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.119965 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.122015 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.122441 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.122771 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: I0322 00:12:26.123127 5116 status_manager.go:895] "Failed to get status 
for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:26 crc kubenswrapper[5116]: E0322 00:12:26.504789 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.133737 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.134077 5116 generic.go:358] "Generic (PLEG): container finished" podID="9f0bc7fcb0822a2c13eb2d22cd8c0641" containerID="05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54" exitCode=1 Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.134233 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerDied","Data":"05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54"} Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.135967 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.136011 5116 scope.go:117] "RemoveContainer" 
containerID="05d6ab41bff1491b16d13fd321fc3f2ed79784b5945966cfa37735f570281b54" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.136538 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.137305 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.137845 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:28 crc kubenswrapper[5116]: I0322 00:12:28.138287 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.148727 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:12:29 crc 
kubenswrapper[5116]: I0322 00:12:29.148901 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"9f0bc7fcb0822a2c13eb2d22cd8c0641","Type":"ContainerStarted","Data":"dad18ac6eecf9b080f9abca1e33533420e48da065dc1002c74302a487f52aacf"} Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.150260 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.150881 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.151420 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.151982 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 
00:12:29.152739 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.705219 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.705501 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.706305 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: E0322 00:12:29.706501 5116 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="6.4s" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.706758 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.707310 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.707936 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.708253 5116 status_manager.go:895] "Failed to get status for pod" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.708595 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.709092 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.709630 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: E0322 00:12:29.709753 5116 desired_state_of_world_populator.go:305] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" volumeName="registry-storage" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.710122 5116 status_manager.go:895] "Failed to get status for pod" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.730805 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.730840 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:29 crc kubenswrapper[5116]: E0322 00:12:29.731402 5116 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:29 crc kubenswrapper[5116]: I0322 00:12:29.732594 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.173753 5116 generic.go:358] "Generic (PLEG): container finished" podID="57755cc5f99000cc11e193051474d4e2" containerID="b32f7d41ffa88138d0582a1450f0b71884223fd0ff939e59c91365f086a779e1" exitCode=0 Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.173888 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerDied","Data":"b32f7d41ffa88138d0582a1450f0b71884223fd0ff939e59c91365f086a779e1"} Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.174131 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"d1bc653fcd32c7727edd873aec58d794e017d63a5b2b4ef85fd0d5342b71f652"} Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.174818 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.174855 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.175571 5116 status_manager.go:895] "Failed to get status for pod" 
podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" pod="openshift-kube-apiserver/installer-12-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-12-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.175582 5116 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.176031 5116 status_manager.go:895] "Failed to get status for pod" podUID="9f0bc7fcb0822a2c13eb2d22cd8c0641" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.176531 5116 status_manager.go:895] "Failed to get status for pod" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.176901 5116 status_manager.go:895] "Failed to get status for pod" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" pod="openshift-authentication/oauth-openshift-66458b6674-8qfhd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-66458b6674-8qfhd\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: I0322 00:12:30.177477 5116 status_manager.go:895] "Failed to get status for pod" 
podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" pod="openshift-image-registry/image-pruner-29568960-tjk88" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/pods/image-pruner-29568960-tjk88\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.447038 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-22T00:12:30Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c8a088031661d94022418e93fb63744c38e1c4cff93ea3b95c096a290c2b7a3\\\"],\\\"sizeBytes\\\":2981840865},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d349aa1ac1aeef5b43252e81842c9fdae910d60790027b9e872b5662ac4b78a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:fe773df04da144155e88592785e3f887dd2518380c36be5aa0d5f3c4b407346b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1738635128},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174629230f874ae7d9ceda909ef45aced0cc8b21537851a0aceca55b0685b122\\\"],\\\"sizeBytes\\\":1641503854},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:286bb0beab328954b0a86b7f066fd5a843b462d6acb2812df7ec788015cd32d4\\\"],\\\"sizeBytes\\\":1597684406},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@
sha256:97ea275ea42f6b1f7f4041dea668f5ec2ba23ebc254008d276dc9589f8fa2899\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:a48320aa1bff5d245dea9a04c9c7f038669d74be18d254ab16f1b1071c6efe02\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1278267500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:85f1323d589d7af13b096b1f9b438b9dfe08f3fab37534e2780e6490a665bf05\\\"],\\\"sizeBytes\\\":1261384762},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:743309461ecf5761903ae65e667afffe78054c5286dd469edd6aebdbc1266545\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:8334109ecba91dfe3e738d0b6bb46e4263ca25d0f5a99832a61371f9ea33f2fa\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.20\\\"],\\\"sizeBytes\\\":1239891053},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8d607fb6cc75ca36bca1e0a9c5bea5d1919b75db20733df69c64c8a10ee8083d\\\"],\\\"sizeBytes\\\":1224304325},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b25b4ab3e224e729bcb897a9d8b4500cb8cdf41dc4e39241fca36503dd7a6e6\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:7010a1d34012ae242b0950c830b00b3a9907b1dc17951db92c5e0d4a06d6d3a1\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.20\\\"],\\\"sizeBytes\\\":1183656546},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a4c5df55584cba56f00004a090923a5c6de2071add5eb1672a5e20aa646aad8c\\\"],\\\"sizeBytes\\\":1126957757},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:df08951924aa23b2333436a1d04b2dba56c366bb4f09d39ae3aedb980e4fb909\\\"],\\\"sizeBytes\\\":1079537324},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9414357f9345a841e0565265700ecc6637f846c83bd5908dbb7b306432465115\\\"],\\\"sizeBytes\\\":1052707833},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:8d1a1e4abe0326c3af89e9eaa4b7449dd2d5b6f9403c677e19b00b24947b1df9\\\"],\\\"sizeBytes\\\":989392005},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b2b1fc3d5bb4944cbd5b23b87566d7ba24b1b66f5a0465f76bcc05023191cc47\\\"],\\\"sizeBytes\\\":971668163},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:be2edaed22535093bdb486afe5960ff4f3b0bd96f88dc1753b584cc28184a0b0\\\"],\\\"sizeBytes\\\":969078739},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3d6c8802ae53d6aecf38aa7b560d7892193806bdeb3d7c1637fac77c47fd1f\\\"],\\\"sizeBytes\\\":876488654},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8a46fa8feeea5d04fd602559027f8bacc97e12bbf8e33793dca08e812e1f8825\\\"],\\\"sizeBytes\\\":847332502},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:36c4867005702f0c4cbfcfa33f18a98596a6c9b1340b633c85ccef84a0c4f889\\\"],\\\"sizeBytes\\\":769516783},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b55c029f731ebbde3c5580eef98a588264f4d6a8ae667805c9521dd1ecf1d5d\\\"],\\\"sizeBytes\\\":721591926},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:bf05b9b2ba66351a6c59f4259fb377f62237a00af3b4f0b95f64409e2f25770e\\\"],\\\"sizeBytes\\\":646867625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a8581a82ba5c8343a743aa302c4848249d8c32a9f2cd10fa68d89d835a1bdf8b\\\"],\\\"sizeBytes\\\":638910445},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae245c97fc463e876c3024efb806fa8f4efb13b3f06f1bdd3e7e1447f5a5dce4\\\"],\\\"sizeBytes\\\":617699779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d4926e304011637ca9df370a193896d685f0f3ffabbec234ec827abdbeb083f9\\\"],\\\"sizeBytes\\\":607756695},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5c5d7468f6838b6a714482e62ea956659212f3415ec8f69989f75eb6d8744a6e\\\"],\\\"sizeBytes\\\":58
4721741},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:68c07ee2fb6450c7b3b35bfdfc158dc475aaa0bcf9fba28b5e310d7e03355c04\\\"],\\\"sizeBytes\\\":545674969},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:574d49b89604b8e8103abf57feee77812fe8cf441eafc17fdff95d57ca80645e\\\"],\\\"sizeBytes\\\":542463064},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:f69b9cc9b9cfde726109a9e12b80a3eefa472d7e29159df0fbc7143c48983cd6\\\"],\\\"sizeBytes\\\":539380592},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9506bdcf97d5200cf2cf4cdf110aebafdd141a24f6589bf1e1cfe27bb7fc1ed2\\\"],\\\"sizeBytes\\\":533027808},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9e388ee2b3562b6267447cbcc4b95ca7a61bf361840d36a682480da671b83612\\\"],\\\"sizeBytes\\\":528200501},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5a2a7b3c2f1598189d8880e6aa15ab11a65b201f25012f77ba41e7487a60729a\\\"],\\\"sizeBytes\\\":527774342},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e5e8108294b086fdb797365e5a46badba9b3d866bdcddc8460a51e05a253753d\\\"],\\\"sizeBytes\\\":526632426},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5827f6ae3beb4853192e02cc18890467bd251b33070f36f9a105991e7e6d3c9b\\\"],\\\"sizeBytes\\\":522490210},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:66c8fe5d45ff249643dae75185dd2787ea1b0ae87d5699a8222149c07689557c\\\"],\\\"sizeBytes\\\":520141094},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:baf975b6944f2844860c440636e0d4b80b2fdc473d30f32ae7d6989f2fc2b135\\\"],\\\"sizeBytes\\\":519815758},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:765cf9c3ebf4df049ebc022beaaf52f52852cf89fb802034536ad91dd45db807\\\"],\\\"sizeBytes\\\":519539350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:52e442bc8198ac925caff87ddd35b3107b7375d5afc9c2eb041ca4e79db72c6f\\\"],\\\"sizeBytes\\\":518690683},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:43b0e0b7e1955ee905e48799a62f50b8a8df553190415ce1f5550375c2507ca5\\\"],\\\"sizeBytes\\\":518251952},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:977a316fa3598eb575a4477dafc09bbf06fad21c4ec2867052225d74f2a9f366\\\"],\\\"sizeBytes\\\":511136541},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e504172345491d90bbbf1e7e45488e73073f4c6d7c2355245871051596fc85db\\\"],\\\"sizeBytes\\\":510122097},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dbd8603d717c26901bcf9731b1e0392ae4bc08a270ed1eeb45839e44bed9607d\\\"],\\\"sizeBytes\\\":508941917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7c6a47106effd9e9a41131e2bf6c832b80cd77b3439334f760b35b0729f2fb00\\\"],\\\"sizeBytes\\\":508318343},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7a726c68cebc9b08edd734a8bae5150ae5950f7734fe9b9c2a6e0d06f21cc095\\\"],\\\"sizeBytes\\\":498380948},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:82501261b9c63012ba3b83fe4d6703c0af5eb9c9151670eb90ae480b9507d761\\\"],\\\"sizeBytes\\\":497232440},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:4e4239621caed0b0d9132d167403631e9af86be9a395977f013e201ead281bb4\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:c0b1bec73fdb6853eb3bd9e9733aee2d760ca09a33cfd94adf9ab7b706e83fa9\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisi
oner:latest\\\"],\\\"sizeBytes\\\":491224335},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b0f7abf2f97afd1127d9245d764338c6047bac1711b2cee43112570a85946360\\\"],\\\"sizeBytes\\\":490381192},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:21b12ff0c81c1d535e7c31aff3a73b1e9ca763e5f88037f59ade0dfab6ed8946\\\"],\\\"sizeBytes\\\":482632652},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:036ed6efe4cb5f5b90ee7f9ef5297c8591b8d67aa36b3c58b4fc5417622a140c\\\"],\\\"sizeBytes\\\":477561861},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0fe5a041a2b99d736e82f1b4a6cd9792c5e23ded475e9f0742cd19234070f989\\\"],\\\"sizeBytes\\\":475327956},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dcb03ccba25366bbdf74cbab6738e7ef1f97f62760886ec445a40cdf29b60418\\\"],\\\"sizeBytes\\\":475137830}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.447577 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.447891 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.448212 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc 
kubenswrapper[5116]: E0322 00:12:30.448585 5116 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 22 00:12:30 crc kubenswrapper[5116]: E0322 00:12:30.448616 5116 kubelet_node_status.go:584] "Unable to update node status" err="update node status exceeds retry count" Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183090 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"c051c9ab63505541bf23433f5ec8f77a814dab2ddf493328097b48d9e97dcdf0"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183142 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"130990b4dc827738704ae59ea639bd7b0fa3f03691dcc3c370a1fa1d424322f9"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183159 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"80c741d698dd97e3f8f7308d04234de45eac4bdc3ba94d035f805eac0180fc33"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.183185 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"212ceaec8f8d88c0485faa66cfe2cba85614269d326bff6250fec302122a2e21"} Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.704309 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.709821 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:31 crc kubenswrapper[5116]: I0322 00:12:31.788972 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191298 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"57755cc5f99000cc11e193051474d4e2","Type":"ContainerStarted","Data":"44ab5aef330a7ac5d594e79eb92f23b9cb2e8c72a76793adbf837829a8ec347e"} Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191482 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191638 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:32 crc kubenswrapper[5116]: I0322 00:12:32.191670 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:34 crc kubenswrapper[5116]: I0322 00:12:34.732757 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:34 crc kubenswrapper[5116]: I0322 00:12:34.733058 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:34 crc kubenswrapper[5116]: I0322 00:12:34.738365 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:37 crc kubenswrapper[5116]: I0322 00:12:37.413667 5116 kubelet.go:3329] "Deleted mirror pod as it didn't match the static Pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:37 crc kubenswrapper[5116]: I0322 00:12:37.414110 5116 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:38 crc kubenswrapper[5116]: I0322 00:12:38.232548 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:38 crc kubenswrapper[5116]: I0322 00:12:38.232581 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:38 crc kubenswrapper[5116]: I0322 00:12:38.237858 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:39 crc kubenswrapper[5116]: I0322 00:12:39.238240 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:39 crc kubenswrapper[5116]: I0322 00:12:39.238269 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:39 crc kubenswrapper[5116]: I0322 00:12:39.743614 5116 status_manager.go:905] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="57755cc5f99000cc11e193051474d4e2" podUID="896372db-5d6a-4ee9-a4a2-ad50e6a194a1" Mar 22 00:12:43 crc kubenswrapper[5116]: I0322 00:12:43.201868 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 22 00:12:47 crc kubenswrapper[5116]: I0322 00:12:47.989342 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.132398 
5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.254907 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-dockercfg-tnfx9\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.665746 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Mar 22 00:12:48 crc kubenswrapper[5116]: I0322 00:12:48.667634 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.071720 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-metrics\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.108780 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mcc-proxy-tls\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.150327 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.188364 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.357874 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.443039 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nwglk\"" Mar 22 
00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.446331 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-l2v2m\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.516077 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.634186 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"openshift-service-ca.crt\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.766598 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6w67b\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.767099 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.845023 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 22 00:12:49 crc kubenswrapper[5116]: I0322 00:12:49.941232 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"node-bootstrapper-token\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.083119 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.205069 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.250674 5116 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-dockercfg-bf7fj\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.475468 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"openshift-service-ca.crt\"" Mar 22 00:12:50 crc kubenswrapper[5116]: I0322 00:12:50.980316 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-dockercfg-sw6nc\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.043510 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-root-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.045877 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-images\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.220865 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.380631 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"image-import-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.452325 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.493531 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.587103 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tjs74\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.607468 5116 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kknhg\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.697203 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-config\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.697658 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.753621 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.758261 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-node-identity\"/\"network-node-identity-cert\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.790740 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t8n29\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.811623 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"marketplace-trusted-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.820130 5116 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.836046 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=37.836011553 podStartE2EDuration="37.836011553s" podCreationTimestamp="2026-03-22 00:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:12:37.561398404 +0000 UTC m=+228.583699787" watchObservedRunningTime="2026-03-22 00:12:51.836011553 +0000 UTC m=+242.858312956" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.840295 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-66458b6674-8qfhd"] Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.840662 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-6b75ff674b-bdglf"] Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841254 5116 kubelet.go:3323] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841297 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="80bbc05d-2ba5-48f3-8d94-3fcd0c0f12d9" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841785 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerName="image-pruner" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841877 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerName="image-pruner" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841936 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.841996 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842101 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerName="installer" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842189 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerName="installer" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842417 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b3ade2-2521-43a2-a5fc-2c33d19f3a58" containerName="image-pruner" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842553 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d04f6d8c-7814-4e1f-8000-afd2938eb5db" containerName="installer" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.842634 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" containerName="oauth-openshift" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.851778 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.852279 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.854623 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.854842 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855068 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-session\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855255 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855498 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"openshift-service-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855639 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855808 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-d2bf2\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.855959 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.856762 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 
00:12:51.856891 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"audit\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.857035 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.857455 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"kube-root-ca.crt\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.862649 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.873570 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.876472 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.876449396 podStartE2EDuration="14.876449396s" podCreationTimestamp="2026-03-22 00:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:12:51.872979332 +0000 UTC m=+242.895280715" watchObservedRunningTime="2026-03-22 00:12:51.876449396 +0000 UTC m=+242.898750769" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.876842 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\"" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.878920 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9pgs7\"" Mar 22 00:12:51 crc 
kubenswrapper[5116]: I0322 00:12:51.910103 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910160 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910201 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910371 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-dir\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910396 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-policies\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910419 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-session\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910444 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910485 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910509 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: 
\"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910534 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz55q\" (UniqueName: \"kubernetes.io/projected/58f868be-d7d6-4e45-96b8-49fb29023df0-kube-api-access-vz55q\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910557 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-login\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910582 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.910597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: 
I0322 00:12:51.910619 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-error\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:51 crc kubenswrapper[5116]: I0322 00:12:51.996919 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011515 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011586 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz55q\" (UniqueName: \"kubernetes.io/projected/58f868be-d7d6-4e45-96b8-49fb29023df0-kube-api-access-vz55q\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011622 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-login\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc 
kubenswrapper[5116]: I0322 00:12:52.011647 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011665 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011697 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-error\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011718 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011745 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011760 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011790 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-dir\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011808 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-policies\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011829 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-session\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: 
I0322 00:12:52.011852 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.011890 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.012370 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-dir\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.013415 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-audit-policies\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.013994 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: 
\"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.014227 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.014402 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.018787 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-error\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.018839 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-session\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.018849 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.019299 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.019555 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.020492 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-login\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.023060 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " 
pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.023084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/58f868be-d7d6-4e45-96b8-49fb29023df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.039790 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz55q\" (UniqueName: \"kubernetes.io/projected/58f868be-d7d6-4e45-96b8-49fb29023df0-kube-api-access-vz55q\") pod \"oauth-openshift-6b75ff674b-bdglf\" (UID: \"58f868be-d7d6-4e45-96b8-49fb29023df0\") " pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.096759 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-global-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.137842 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"proxy-tls\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.172114 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"service-ca-bundle\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.178178 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.233004 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.254046 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"etcd-serving-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.273158 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8dkm8\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.373486 5116 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.396297 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.502183 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-bgxvm\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.517266 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"audit-1\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.573501 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.599857 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.622731 5116 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 22 00:12:52 crc 
kubenswrapper[5116]: I0322 00:12:52.675710 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.711080 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-client\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.744398 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.748147 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"trusted-ca\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.816809 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"kube-root-ca.crt\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.877326 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.877430 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.891439 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-wzhvk\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.893489 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-config\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.924521 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-config\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.933260 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Mar 22 00:12:52 crc kubenswrapper[5116]: I0322 00:12:52.959032 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.006630 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.057274 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.057353 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.082970 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"config\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.135797 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"kube-root-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.302388 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kl6m8\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.374549 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.379898 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.438986 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.483386 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"ingress-operator-dockercfg-74nwh\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.653479 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.686184 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.708102 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ebea9b-fc7b-4d54-af53-f6f61e0fce97" path="/var/lib/kubelet/pods/73ebea9b-fc7b-4d54-af53-f6f61e0fce97/volumes"
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.708826 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.724995 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-6tbpn\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.760751 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.767622 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.804285 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.890534 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-client\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.890542 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:53 crc kubenswrapper[5116]: I0322 00:12:53.891906 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.005635 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.011313 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"cluster-version-operator-serving-cert\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.013667 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"catalog-operator-serving-cert\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.123308 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kw8fx\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.166144 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.201767 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"ovnkube-identity-cm\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.212581 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.220454 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.256230 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.288002 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"packageserver-service-cert\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.378693 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.380493 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-tls\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.453064 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-dockercfg-4vdnc\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.528969 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serving-cert\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.569092 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-rbac-proxy\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.773768 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.869329 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-ntnd7\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.875740 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-etcd-operator\"/\"etcd-operator-serving-cert\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.998017 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"openshift-config-operator-dockercfg-sjn6s\""
Mar 22 00:12:54 crc kubenswrapper[5116]: I0322 00:12:54.998742 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.038552 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mdwwj\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.045644 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.053071 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.171887 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ac-dockercfg-gj7jx\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.186149 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2h6bs\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.203267 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.236671 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"client-ca\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.295552 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kpvmz\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.322947 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-mmcpt\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.389632 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.441592 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-operator\"/\"metrics-tls\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.495711 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.541957 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.588795 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.736574 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.776508 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.781645 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.834106 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"dns-operator-dockercfg-wbbsn\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.901969 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"kube-rbac-proxy\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.929837 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.948255 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-dockercfg-gnx66\""
Mar 22 00:12:55 crc kubenswrapper[5116]: I0322 00:12:55.954483 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"hostpath-provisioner\"/\"csi-hostpath-provisioner-sa-dockercfg-7dcws\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.140163 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-version\"/\"default-dockercfg-hqpm5\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.267448 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.379356 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"serving-cert\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.392838 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns-operator\"/\"metrics-tls\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.394905 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.441672 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.494822 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-nl8tp\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.543021 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"config\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.556758 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.560057 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"openshift-service-ca.crt\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.598087 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"etcd-client\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.599878 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.614908 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-certs-default\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.659534 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.686847 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.850230 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.905355 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.930885 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.945996 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\""
Mar 22 00:12:56 crc kubenswrapper[5116]: I0322 00:12:56.956285 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.001593 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-4zqgh\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.008736 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.039128 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-node-identity\"/\"env-overrides\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.113699 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-tls\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.137696 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-config-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.180043 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.181779 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.184343 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.185621 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.321772 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.451196 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-serving-cert\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.484139 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.593213 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.639027 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-serving-cert\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.658855 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.778654 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jmhxf\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.827376 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"audit-1\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.834845 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.856473 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:57 crc kubenswrapper[5116]: I0322 00:12:57.972300 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.118822 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.228729 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.393290 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.482672 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager-operator\"/\"kube-root-ca.crt\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.483948 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"serving-cert\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.578911 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.582036 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.786830 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-operator-images\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.863162 5116 kubelet.go:2547] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.863760 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" containerID="cri-o://5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5" gracePeriod=5
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.878454 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-apiserver-operator\"/\"kube-apiserver-operator-config\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.886923 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"trusted-ca-bundle\""
Mar 22 00:12:58 crc kubenswrapper[5116]: I0322 00:12:58.938656 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.032865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-ca-bundle\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.169519 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"marketplace-operator-dockercfg-2cfkp\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.213792 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.241643 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"metrics-tls\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.266094 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"serving-cert\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.324848 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-config\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.392523 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.445398 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.557819 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.565016 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g6kgg\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.600294 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-server-dockercfg-dzw6b\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.615548 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.618179 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.624727 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"hostpath-provisioner\"/\"kube-root-ca.crt\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.679240 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"etcd-service-ca-bundle\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.714128 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-route-controller-manager\"/\"serving-cert\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.943342 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-admission-controller-secret\""
Mar 22 00:12:59 crc kubenswrapper[5116]: I0322 00:12:59.950235 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-djmfg\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.007896 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-6n5ln\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.031589 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.117892 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-api\"/\"control-plane-machine-set-operator-tls\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.510012 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-authentication-operator\"/\"serving-cert\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.536131 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-apiserver\"/\"kube-root-ca.crt\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.655582 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.715780 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.757567 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.771047 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.787656 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"olm-operator-serviceaccount-dockercfg-4gqzj\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.827577 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.839633 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b75ff674b-bdglf"]
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.856517 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-bjqfd\""
Mar 22 00:13:00 crc kubenswrapper[5116]: I0322 00:13:00.975255 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-version\"/\"kube-root-ca.crt\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.021963 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.029990 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-controller-dockercfg-xnj77\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.124476 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b75ff674b-bdglf"]
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.201310 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-etcd-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.201935 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-6c46w\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.347861 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-controller-manager-operator\"/\"openshift-controller-manager-operator-dockercfg-jcmfj\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.358625 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" event={"ID":"58f868be-d7d6-4e45-96b8-49fb29023df0","Type":"ContainerStarted","Data":"8fe53e940e9eae97000069792a4b484816e6c60558cfba780da749fcb2830fe5"}
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.384015 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.417746 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.425088 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-operator-tls\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.463055 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.479028 5116 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.539210 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"package-server-manager-serving-cert\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.570448 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"mco-proxy-tls\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.852754 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-scheduler-operator-serving-cert\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.927275 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Mar 22 00:13:01 crc kubenswrapper[5116]: I0322 00:13:01.998692 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-dockercfg-2wbn2\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.032135 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.032302 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-config-operator\"/\"config-operator-serving-cert\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.071505 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tk7bt\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.248911 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"client-ca\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.274631 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"kube-root-ca.crt\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.290908 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-controller-manager-operator\"/\"kube-controller-manager-operator-serving-cert\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.365380 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" event={"ID":"58f868be-d7d6-4e45-96b8-49fb29023df0","Type":"ContainerStarted","Data":"c4c2eb53d08e3ad6c104a996b921878333870c0fc1c07a0372d96ac12b016924"}
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.365753 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf"
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.368444 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-scheduler-operator\"/\"openshift-kube-scheduler-operator-config\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.373669 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf"
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.388691 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b75ff674b-bdglf" podStartSLOduration=63.388673777 podStartE2EDuration="1m3.388673777s" podCreationTimestamp="2026-03-22 00:11:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:02.387518658 +0000 UTC m=+253.409820021" watchObservedRunningTime="2026-03-22 00:13:02.388673777 +0000 UTC m=+253.410975150"
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.587612 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-controller-manager\"/\"config\""
Mar 22 00:13:02 crc kubenswrapper[5116]: I0322 00:13:02.732341 5116 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.377243 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log"
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.377457 5116 generic.go:358] "Generic (PLEG): container finished" podID="f7dbc7e1ee9c187a863ef9b473fad27b" containerID="5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5" exitCode=137
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.432747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log"
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.432857 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569278 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569406 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") "
Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569401 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir" (OuterVolumeSpecName: "resource-dir") pod
"f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569451 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569490 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569596 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") pod \"f7dbc7e1ee9c187a863ef9b473fad27b\" (UID: \"f7dbc7e1ee9c187a863ef9b473fad27b\") " Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569488 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests" (OuterVolumeSpecName: "manifests") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569522 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock" (OuterVolumeSpecName: "var-lock") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.569731 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log" (OuterVolumeSpecName: "var-log") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570529 5116 reconciler_common.go:299] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570579 5116 reconciler_common.go:299] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-manifests\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570609 5116 reconciler_common.go:299] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-lock\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.570631 5116 reconciler_common.go:299] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-var-log\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.582841 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f7dbc7e1ee9c187a863ef9b473fad27b" (UID: "f7dbc7e1ee9c187a863ef9b473fad27b"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:13:04 crc kubenswrapper[5116]: I0322 00:13:04.672034 5116 reconciler_common.go:299] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f7dbc7e1ee9c187a863ef9b473fad27b-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.384655 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f7dbc7e1ee9c187a863ef9b473fad27b/startup-monitor/0.log" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.384992 5116 scope.go:117] "RemoveContainer" containerID="5cb08176408e0f1bc306dc6b2afd3ce9b0c721a85dbb42d8ee9b826aa3eecff5" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.385201 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.704972 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" path="/var/lib/kubelet/pods/f7dbc7e1ee9c187a863ef9b473fad27b/volumes" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.705514 5116 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.716663 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.716710 5116 kubelet.go:2759] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="76689ad2-1282-4ff1-bc50-5fa527bbd4d2" Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.720011 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 22 00:13:05 crc kubenswrapper[5116]: I0322 00:13:05.720049 5116 kubelet.go:2784] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="76689ad2-1282-4ff1-bc50-5fa527bbd4d2" Mar 22 00:13:13 crc kubenswrapper[5116]: I0322 00:13:13.914579 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-apiserver\"/\"encryption-config-1\"" Mar 22 00:13:16 crc kubenswrapper[5116]: I0322 00:13:16.304975 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.621909 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"] Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.623020 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" containerID="cri-o://f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" gracePeriod=30 Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.640359 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:13:17 crc kubenswrapper[5116]: I0322 00:13:17.640894 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" containerID="cri-o://9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" gracePeriod=30 Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.011839 5116 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.027616 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.040568 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041160 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041206 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041237 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041245 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041271 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041279 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041375 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerName="route-controller-manager" 
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041395 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerName="controller-manager" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.041406 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7dbc7e1ee9c187a863ef9b473fad27b" containerName="startup-monitor" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.044856 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.055322 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.065723 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.074844 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.080814 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144608 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144666 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144756 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvntc\" (UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144819 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144847 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" 
(UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144882 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144931 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144951 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.144995 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145030 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") pod \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\" (UID: \"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145054 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") pod \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\" (UID: \"5f51f3b4-6887-42b5-ad77-5a2f349a162a\") " Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145090 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp" (OuterVolumeSpecName: "tmp") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145587 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp" (OuterVolumeSpecName: "tmp") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145627 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145721 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca" (OuterVolumeSpecName: "client-ca") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145726 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145852 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145946 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.145989 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146033 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146227 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146242 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config" (OuterVolumeSpecName: "config") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146354 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146422 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146446 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146570 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146591 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f51f3b4-6887-42b5-ad77-5a2f349a162a-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146603 5116 
reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f51f3b4-6887-42b5-ad77-5a2f349a162a-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146615 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146766 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca" (OuterVolumeSpecName: "client-ca") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146814 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.146881 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config" (OuterVolumeSpecName: "config") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.151841 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc" (OuterVolumeSpecName: "kube-api-access-wvntc") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "kube-api-access-wvntc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.152001 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728" (OuterVolumeSpecName: "kube-api-access-qp728") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "kube-api-access-qp728". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.152082 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5f51f3b4-6887-42b5-ad77-5a2f349a162a" (UID: "5f51f3b4-6887-42b5-ad77-5a2f349a162a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.152230 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" (UID: "ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.247945 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248026 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248077 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248117 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248370 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvplp\" (UniqueName: 
\"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248463 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248526 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248576 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.248604 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.249032 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251293 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.249034 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.250341 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251309 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc 
kubenswrapper[5116]: I0322 00:13:18.249388 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251466 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251518 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251536 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251552 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wvntc\" (UniqueName: \"kubernetes.io/projected/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-kube-api-access-wvntc\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251594 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f51f3b4-6887-42b5-ad77-5a2f349a162a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251609 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp728\" (UniqueName: \"kubernetes.io/projected/5f51f3b4-6887-42b5-ad77-5a2f349a162a-kube-api-access-qp728\") on node \"crc\" DevicePath \"\"" 
Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.251622 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.252111 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.252303 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.252407 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.254078 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.254212 5116 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.266054 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"route-controller-manager-6bd55b6c86-s5xv6\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.267355 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"controller-manager-5b5ff9bffd-8qd88\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.378854 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.394912 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.472916 5116 generic.go:358] "Generic (PLEG): container finished" podID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" exitCode=0 Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.473078 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerDied","Data":"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"} Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.473114 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" event={"ID":"5f51f3b4-6887-42b5-ad77-5a2f349a162a","Type":"ContainerDied","Data":"b7695e77d66bbd117e7c0ab58ba5188e43f6d887ec5e97574683b7c2f512fe56"} Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.473140 5116 scope.go:117] "RemoveContainer" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.475512 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.486125 5116 generic.go:358] "Generic (PLEG): container finished" podID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerID="a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc" exitCode=0 Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.486217 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerDied","Data":"a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc"} Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.487792 5116 generic.go:358] "Generic (PLEG): container finished" podID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" exitCode=0 Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.487936 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerDied","Data":"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"} Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.488061 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" event={"ID":"ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513","Type":"ContainerDied","Data":"4a8a88fe9fa050abb0479c637d1e4e232ab389aa2a939c4d9f3135fe99408731"} Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.488273 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-65b6cccf98-9kdkj" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.532659 5116 scope.go:117] "RemoveContainer" containerID="a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.570618 5116 scope.go:117] "RemoveContainer" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" Mar 22 00:13:18 crc kubenswrapper[5116]: E0322 00:13:18.571180 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235\": container with ID starting with 9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235 not found: ID does not exist" containerID="9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.571233 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235"} err="failed to get container status \"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235\": rpc error: code = NotFound desc = could not find container \"9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235\": container with ID starting with 9114a650032efd74a5e2a932e1a4e1b3463dd65c69ea801f51e4fffd08b48235 not found: ID does not exist" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.571258 5116 scope.go:117] "RemoveContainer" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.604648 5116 scope.go:117] "RemoveContainer" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" Mar 22 00:13:18 crc kubenswrapper[5116]: E0322 00:13:18.607045 5116 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde\": container with ID starting with f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde not found: ID does not exist" containerID="f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.607098 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde"} err="failed to get container status \"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde\": rpc error: code = NotFound desc = could not find container \"f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde\": container with ID starting with f1fb21485888c6408efe6382b9ad0e01400a965e3e20a4bb5ea08ffa6332ecde not found: ID does not exist" Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.628543 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.633067 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-776cdc94d6-fw5k5"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.644199 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.650160 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65b6cccf98-9kdkj"] Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.693828 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:18 crc kubenswrapper[5116]: W0322 00:13:18.700423 5116 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd576ea5f_6d26_4c36_8d3d_1efeed9d5691.slice/crio-3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af WatchSource:0}: Error finding container 3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af: Status 404 returned error can't find the container with id 3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af Mar 22 00:13:18 crc kubenswrapper[5116]: I0322 00:13:18.750108 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:18 crc kubenswrapper[5116]: W0322 00:13:18.751927 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51169795_1332_4ee1_94c0_c2f58d62de92.slice/crio-89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39 WatchSource:0}: Error finding container 89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39: Status 404 returned error can't find the container with id 89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39 Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.494265 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerStarted","Data":"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"} Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.494729 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerStarted","Data":"89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39"} Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.494770 5116 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.496883 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerStarted","Data":"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"} Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.496922 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerStarted","Data":"3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af"} Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.497138 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.500889 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.501328 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerStarted","Data":"a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923"} Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.501761 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.507827 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.507974 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.543260 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" podStartSLOduration=2.543240684 podStartE2EDuration="2.543240684s" podCreationTimestamp="2026-03-22 00:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:19.518985448 +0000 UTC m=+270.541286921" watchObservedRunningTime="2026-03-22 00:13:19.543240684 +0000 UTC m=+270.565542067" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.552907 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" podStartSLOduration=2.5528916649999998 podStartE2EDuration="2.552891665s" podCreationTimestamp="2026-03-22 00:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:19.551312509 +0000 UTC m=+270.573613882" watchObservedRunningTime="2026-03-22 00:13:19.552891665 +0000 UTC m=+270.575193038" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.703499 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f51f3b4-6887-42b5-ad77-5a2f349a162a" path="/var/lib/kubelet/pods/5f51f3b4-6887-42b5-ad77-5a2f349a162a/volumes" Mar 22 00:13:19 crc kubenswrapper[5116]: I0322 00:13:19.704251 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513" path="/var/lib/kubelet/pods/ecdf8fd4-dbf9-45a4-9b37-ed2e14cca513/volumes" Mar 22 
00:13:21 crc kubenswrapper[5116]: I0322 00:13:21.331974 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"] Mar 22 00:13:21 crc kubenswrapper[5116]: I0322 00:13:21.341950 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"] Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.522885 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager" containerID="cri-o://556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" gracePeriod=30 Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.523516 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager" containerID="cri-o://7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" gracePeriod=30 Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.809126 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-qqw4z\"" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.889274 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.911774 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918009 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c55489586-7pf6k"] Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918546 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918566 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918588 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918595 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918704 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerName="route-controller-manager" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.918717 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" containerName="controller-manager" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.925435 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:22 crc kubenswrapper[5116]: I0322 00:13:22.959313 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c55489586-7pf6k"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.009053 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.013387 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.013528 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.015970 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016030 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016080 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " Mar 22 00:13:23 crc kubenswrapper[5116]: 
I0322 00:13:23.016106 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016188 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016233 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016258 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016337 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: \"51169795-1332-4ee1-94c0-c2f58d62de92\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016365 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") pod \"51169795-1332-4ee1-94c0-c2f58d62de92\" (UID: 
\"51169795-1332-4ee1-94c0-c2f58d62de92\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016421 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016485 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") pod \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\" (UID: \"d576ea5f-6d26-4c36-8d3d-1efeed9d5691\") " Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016687 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-config\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016737 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25s54\" (UniqueName: \"kubernetes.io/projected/086874e0-a0bb-4e3d-b08a-ff841931a631-kube-api-access-25s54\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016768 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/086874e0-a0bb-4e3d-b08a-ff841931a631-serving-cert\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " 
pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016792 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-proxy-ca-bundles\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/086874e0-a0bb-4e3d-b08a-ff841931a631-tmp\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.016852 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-client-ca\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.019589 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca" (OuterVolumeSpecName: "client-ca") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.019828 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca" (OuterVolumeSpecName: "client-ca") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020097 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020244 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp" (OuterVolumeSpecName: "tmp") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020580 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config" (OuterVolumeSpecName: "config") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020706 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp" (OuterVolumeSpecName: "tmp") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.020765 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config" (OuterVolumeSpecName: "config") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.023549 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp" (OuterVolumeSpecName: "kube-api-access-mvplp") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "kube-api-access-mvplp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.024216 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "51169795-1332-4ee1-94c0-c2f58d62de92" (UID: "51169795-1332-4ee1-94c0-c2f58d62de92"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.024302 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.025750 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d" (OuterVolumeSpecName: "kube-api-access-vd44d") pod "d576ea5f-6d26-4c36-8d3d-1efeed9d5691" (UID: "d576ea5f-6d26-4c36-8d3d-1efeed9d5691"). InnerVolumeSpecName "kube-api-access-vd44d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.057432 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.057499 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.057547 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.058112 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.058197 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638" gracePeriod=600
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117550 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117600 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117649 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117694 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-config\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117791 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25s54\" (UniqueName: \"kubernetes.io/projected/086874e0-a0bb-4e3d-b08a-ff841931a631-kube-api-access-25s54\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117811 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117840 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/086874e0-a0bb-4e3d-b08a-ff841931a631-serving-cert\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117858 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-proxy-ca-bundles\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117877 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/086874e0-a0bb-4e3d-b08a-ff841931a631-tmp\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117897 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-client-ca\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117944 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vd44d\" (UniqueName: \"kubernetes.io/projected/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-kube-api-access-vd44d\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117955 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117963 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-client-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117972 5116 reconciler_common.go:299] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117980 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51169795-1332-4ee1-94c0-c2f58d62de92-tmp\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117987 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-client-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.117995 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-tmp\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118003 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51169795-1332-4ee1-94c0-c2f58d62de92-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118011 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51169795-1332-4ee1-94c0-c2f58d62de92-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118019 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mvplp\" (UniqueName: \"kubernetes.io/projected/51169795-1332-4ee1-94c0-c2f58d62de92-kube-api-access-mvplp\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118027 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d576ea5f-6d26-4c36-8d3d-1efeed9d5691-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.118900 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/086874e0-a0bb-4e3d-b08a-ff841931a631-tmp\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.119305 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-config\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.119443 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-client-ca\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.119550 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/086874e0-a0bb-4e3d-b08a-ff841931a631-proxy-ca-bundles\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.121586 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/086874e0-a0bb-4e3d-b08a-ff841931a631-serving-cert\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.134361 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25s54\" (UniqueName: \"kubernetes.io/projected/086874e0-a0bb-4e3d-b08a-ff841931a631-kube-api-access-25s54\") pod \"controller-manager-7c55489586-7pf6k\" (UID: \"086874e0-a0bb-4e3d-b08a-ff841931a631\") " pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.218916 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.220576 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.220481 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.222264 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.222319 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.223098 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.223237 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.223602 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.227533 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.240757 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"route-controller-manager-7667c5b846-cb649\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.280484 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.333906 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.485621 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c55489586-7pf6k"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.530753 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" event={"ID":"086874e0-a0bb-4e3d-b08a-ff841931a631","Type":"ContainerStarted","Data":"dc029588657b553f65a348cc7284f02ac2055add166776a383e724f1320a617e"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.533747 5116 generic.go:358] "Generic (PLEG): container finished" podID="51169795-1332-4ee1-94c0-c2f58d62de92" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad" exitCode=0
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.533934 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.536337 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerDied","Data":"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.536372 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88" event={"ID":"51169795-1332-4ee1-94c0-c2f58d62de92","Type":"ContainerDied","Data":"89205bef1b635d4e40295ff0aadf34fc6300396f7696726f6eed2a32f962fb39"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.536402 5116 scope.go:117] "RemoveContainer" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.552261 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638" exitCode=0
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.552389 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.552420 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557314 5116 generic.go:358] "Generic (PLEG): container finished" podID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3" exitCode=0
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557494 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerDied","Data":"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557539 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6" event={"ID":"d576ea5f-6d26-4c36-8d3d-1efeed9d5691","Type":"ContainerDied","Data":"3fc069942f55aab73b08b732d7bfd6ba14281b7bcdf089d7e4a46f49bc7f68af"}
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.557628 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.573386 5116 scope.go:117] "RemoveContainer" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"
Mar 22 00:13:23 crc kubenswrapper[5116]: E0322 00:13:23.574041 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad\": container with ID starting with 7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad not found: ID does not exist" containerID="7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.574085 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad"} err="failed to get container status \"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad\": rpc error: code = NotFound desc = could not find container \"7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad\": container with ID starting with 7563ae0b361008c176c05f6e7bd3fc9a0ee5e090cb5690b0be49c13d5c140fad not found: ID does not exist"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.574108 5116 scope.go:117] "RemoveContainer" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.586806 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.590072 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.593008 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b5ff9bffd-8qd88"]
Mar 22 00:13:23 crc kubenswrapper[5116]: W0322 00:13:23.600674 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07a2bcaa_7073_4b67_bd66_80d71ec35171.slice/crio-1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da WatchSource:0}: Error finding container 1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da: Status 404 returned error can't find the container with id 1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.617081 5116 scope.go:117] "RemoveContainer" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"
Mar 22 00:13:23 crc kubenswrapper[5116]: E0322 00:13:23.618787 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3\": container with ID starting with 556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3 not found: ID does not exist" containerID="556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.618822 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3"} err="failed to get container status \"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3\": rpc error: code = NotFound desc = could not find container \"556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3\": container with ID starting with 556604124c289c80f20b9f13945f1d1714f519db60aa3df20ac283467865a9c3 not found: ID does not exist"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.624361 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.629525 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6bd55b6c86-s5xv6"]
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.703250 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51169795-1332-4ee1-94c0-c2f58d62de92" path="/var/lib/kubelet/pods/51169795-1332-4ee1-94c0-c2f58d62de92/volumes"
Mar 22 00:13:23 crc kubenswrapper[5116]: I0322 00:13:23.706566 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d576ea5f-6d26-4c36-8d3d-1efeed9d5691" path="/var/lib/kubelet/pods/d576ea5f-6d26-4c36-8d3d-1efeed9d5691/volumes"
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.566607 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" event={"ID":"086874e0-a0bb-4e3d-b08a-ff841931a631","Type":"ContainerStarted","Data":"daddf578ea7897b2fcc71cca1907b0429901ca877ff2e97584b0ce47cae89286"}
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.567736 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.569535 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerStarted","Data":"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15"}
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.569561 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerStarted","Data":"1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da"}
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.570316 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.580343 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k"
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.589322 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c55489586-7pf6k" podStartSLOduration=3.589308178 podStartE2EDuration="3.589308178s" podCreationTimestamp="2026-03-22 00:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:24.587316167 +0000 UTC m=+275.609617580" watchObservedRunningTime="2026-03-22 00:13:24.589308178 +0000 UTC m=+275.611609551"
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.625835 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" podStartSLOduration=2.6258122459999997 podStartE2EDuration="2.625812246s" podCreationTimestamp="2026-03-22 00:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:24.624543811 +0000 UTC m=+275.646845194" watchObservedRunningTime="2026-03-22 00:13:24.625812246 +0000 UTC m=+275.648113629"
Mar 22 00:13:24 crc kubenswrapper[5116]: I0322 00:13:24.650393 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:26 crc kubenswrapper[5116]: I0322 00:13:26.535052 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"pprof-cert\""
Mar 22 00:13:30 crc kubenswrapper[5116]: I0322 00:13:30.867776 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-machine-config-operator\"/\"machine-config-daemon-dockercfg-w9nzh\""
Mar 22 00:13:31 crc kubenswrapper[5116]: I0322 00:13:31.117461 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Mar 22 00:13:36 crc kubenswrapper[5116]: I0322 00:13:36.261451 5116 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 22 00:13:37 crc kubenswrapper[5116]: I0322 00:13:37.643342 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"]
Mar 22 00:13:37 crc kubenswrapper[5116]: I0322 00:13:37.643618 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager" containerID="cri-o://ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" gracePeriod=30
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.081457 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.106181 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"]
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.106953 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager"
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.106976 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager"
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.107122 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerName="route-controller-manager"
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.114868 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.122476 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"]
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123046 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") "
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123148 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") "
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123277 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") "
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123347 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID: \"07a2bcaa-7073-4b67-bd66-80d71ec35171\") "
Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123381 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") pod \"07a2bcaa-7073-4b67-bd66-80d71ec35171\" (UID:
\"07a2bcaa-7073-4b67-bd66-80d71ec35171\") " Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123557 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-config\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123629 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-client-ca\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123696 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mx7\" (UniqueName: \"kubernetes.io/projected/e0cb0732-e531-42fd-a042-7d691a4292ed-kube-api-access-h2mx7\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123736 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0cb0732-e531-42fd-a042-7d691a4292ed-tmp\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.123765 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0cb0732-e531-42fd-a042-7d691a4292ed-serving-cert\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.128580 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp" (OuterVolumeSpecName: "tmp") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.129200 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca" (OuterVolumeSpecName: "client-ca") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.129210 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config" (OuterVolumeSpecName: "config") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.139074 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.139078 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz" (OuterVolumeSpecName: "kube-api-access-w2shz") pod "07a2bcaa-7073-4b67-bd66-80d71ec35171" (UID: "07a2bcaa-7073-4b67-bd66-80d71ec35171"). InnerVolumeSpecName "kube-api-access-w2shz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.224920 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-config\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.224985 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-client-ca\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225032 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mx7\" (UniqueName: \"kubernetes.io/projected/e0cb0732-e531-42fd-a042-7d691a4292ed-kube-api-access-h2mx7\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225057 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/e0cb0732-e531-42fd-a042-7d691a4292ed-tmp\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225080 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0cb0732-e531-42fd-a042-7d691a4292ed-serving-cert\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225123 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2shz\" (UniqueName: \"kubernetes.io/projected/07a2bcaa-7073-4b67-bd66-80d71ec35171-kube-api-access-w2shz\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225137 5116 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225146 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07a2bcaa-7073-4b67-bd66-80d71ec35171-tmp\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225262 5116 reconciler_common.go:299] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07a2bcaa-7073-4b67-bd66-80d71ec35171-client-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225489 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0cb0732-e531-42fd-a042-7d691a4292ed-tmp\") pod 
\"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225272 5116 reconciler_common.go:299] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07a2bcaa-7073-4b67-bd66-80d71ec35171-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.225871 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-client-ca\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.226689 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0cb0732-e531-42fd-a042-7d691a4292ed-config\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.230143 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0cb0732-e531-42fd-a042-7d691a4292ed-serving-cert\") pod \"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.240228 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mx7\" (UniqueName: \"kubernetes.io/projected/e0cb0732-e531-42fd-a042-7d691a4292ed-kube-api-access-h2mx7\") pod 
\"route-controller-manager-6c8bf966f9-49m79\" (UID: \"e0cb0732-e531-42fd-a042-7d691a4292ed\") " pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.472194 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661243 5116 generic.go:358] "Generic (PLEG): container finished" podID="07a2bcaa-7073-4b67-bd66-80d71ec35171" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" exitCode=0 Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661353 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerDied","Data":"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15"} Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661690 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" event={"ID":"07a2bcaa-7073-4b67-bd66-80d71ec35171","Type":"ContainerDied","Data":"1eb6513c1a20aa0a7c68cfa5fa675cab6f1c6aa6437fc6fd4d30bd3bc79717da"} Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661719 5116 scope.go:117] "RemoveContainer" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.661432 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.690187 5116 scope.go:117] "RemoveContainer" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" Mar 22 00:13:38 crc kubenswrapper[5116]: E0322 00:13:38.690720 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15\": container with ID starting with ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15 not found: ID does not exist" containerID="ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.690756 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15"} err="failed to get container status \"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15\": rpc error: code = NotFound desc = could not find container \"ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15\": container with ID starting with ff4355c965e1f733ee45286e6806d457f6952cf87aeffaae72ed1bca331a3c15 not found: ID does not exist" Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.697914 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.703356 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7667c5b846-cb649"] Mar 22 00:13:38 crc kubenswrapper[5116]: I0322 00:13:38.961935 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79"] Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 
00:13:39.671104 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" event={"ID":"e0cb0732-e531-42fd-a042-7d691a4292ed","Type":"ContainerStarted","Data":"20f72b984295860c2949f5663643d60c6697022c401296c0a5b1e5299f237b02"} Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 00:13:39.671430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" event={"ID":"e0cb0732-e531-42fd-a042-7d691a4292ed","Type":"ContainerStarted","Data":"2d8fcd89b7d6ed557a003f7743e9c81d54cb4051f50142ca977de2db85b69ed3"} Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 00:13:39.692262 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" podStartSLOduration=2.692230781 podStartE2EDuration="2.692230781s" podCreationTimestamp="2026-03-22 00:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:39.68539663 +0000 UTC m=+290.707698003" watchObservedRunningTime="2026-03-22 00:13:39.692230781 +0000 UTC m=+290.714532204" Mar 22 00:13:39 crc kubenswrapper[5116]: I0322 00:13:39.717433 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a2bcaa-7073-4b67-bd66-80d71ec35171" path="/var/lib/kubelet/pods/07a2bcaa-7073-4b67-bd66-80d71ec35171/volumes" Mar 22 00:13:40 crc kubenswrapper[5116]: I0322 00:13:40.676957 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:40 crc kubenswrapper[5116]: I0322 00:13:40.683766 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c8bf966f9-49m79" Mar 22 00:13:40 crc kubenswrapper[5116]: 
I0322 00:13:40.700854 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 22 00:13:49 crc kubenswrapper[5116]: I0322 00:13:49.846794 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:13:49 crc kubenswrapper[5116]: I0322 00:13:49.847873 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.287548 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.288376 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4x6l" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server" containerID="cri-o://fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.303452 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.304031 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrcmf" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server" containerID="cri-o://58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.312817 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.313111 5116 
kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator" containerID="cri-o://a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.335142 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.335542 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kp7rb" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" containerID="cri-o://c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.341100 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.341547 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wss9d" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server" containerID="cri-o://a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9" gracePeriod=30 Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.349027 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.356697 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"] Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.356809 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.515978 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.516034 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-tmp\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.516075 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.516094 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mdmx\" (UniqueName: \"kubernetes.io/projected/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-kube-api-access-2mdmx\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617616 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-tmp\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617691 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617717 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mdmx\" (UniqueName: \"kubernetes.io/projected/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-kube-api-access-2mdmx\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.617793 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.618362 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-tmp\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " 
pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.618974 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-trusted-ca\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.634298 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mdmx\" (UniqueName: \"kubernetes.io/projected/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-kube-api-access-2mdmx\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.635350 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ba8a6a03-e32a-4121-86e1-d856ddf7a73b-marketplace-operator-metrics\") pod \"marketplace-operator-547dbd544d-c8r5t\" (UID: \"ba8a6a03-e32a-4121-86e1-d856ddf7a73b\") " pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.722547 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"
Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.761417 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" cmd=["grpc_health_probe","-addr=:50051"]
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.761563 5116 generic.go:358] "Generic (PLEG): container finished" podID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" exitCode=0
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.761594 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985"}
Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.770589 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" cmd=["grpc_health_probe","-addr=:50051"]
Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.772024 5116 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985" cmd=["grpc_health_probe","-addr=:50051"]
Mar 22 00:13:52 crc kubenswrapper[5116]: E0322 00:13:52.772089 5116 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-kp7rb" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server" probeResult="unknown"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.780189 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.787027 5116 generic.go:358] "Generic (PLEG): container finished" podID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerID="a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9" exitCode=0
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.787125 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9"}
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.804604 5116 generic.go:358] "Generic (PLEG): container finished" podID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerID="a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923" exitCode=0
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.804948 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerDied","Data":"a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923"}
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.804990 5116 scope.go:117] "RemoveContainer" containerID="a6ce4d9e4fad533d2bf3efd803ec0a13bd8f8eba0dea6f48dfd40baebf8e74cc"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.835287 5116 generic.go:358] "Generic (PLEG): container finished" podID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerID="58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727" exitCode=0
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.835362 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727"}
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.843457 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4x6l"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.842404 5116 generic.go:358] "Generic (PLEG): container finished" podID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerID="fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722" exitCode=0
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.845556 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4x6l" event={"ID":"da3b0eb3-e48f-4080-bfdc-522f18cf2876","Type":"ContainerDied","Data":"fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722"}
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.876387 5116 scope.go:117] "RemoveContainer" containerID="fcd6dc8707ea620677d752dcbac5f99025670bc7fb912c816a48189214a21722"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.906919 5116 scope.go:117] "RemoveContainer" containerID="bd585f6a2418bf617978e44c5ded778fb5ab883949c7c6d99346b0cce7aab8d6"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.907406 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.924127 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") pod \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") "
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.924367 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") pod \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") "
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.924408 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") pod \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\" (UID: \"da3b0eb3-e48f-4080-bfdc-522f18cf2876\") "
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.926148 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities" (OuterVolumeSpecName: "utilities") pod "da3b0eb3-e48f-4080-bfdc-522f18cf2876" (UID: "da3b0eb3-e48f-4080-bfdc-522f18cf2876"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.932181 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2" (OuterVolumeSpecName: "kube-api-access-zh2v2") pod "da3b0eb3-e48f-4080-bfdc-522f18cf2876" (UID: "da3b0eb3-e48f-4080-bfdc-522f18cf2876"). InnerVolumeSpecName "kube-api-access-zh2v2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.933151 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.935756 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.940759 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.941579 5116 scope.go:117] "RemoveContainer" containerID="48d478e51d471e4228cfd3bf987de6d6edc345926b49b645231a761cba1fbba7"
Mar 22 00:13:52 crc kubenswrapper[5116]: I0322 00:13:52.981194 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da3b0eb3-e48f-4080-bfdc-522f18cf2876" (UID: "da3b0eb3-e48f-4080-bfdc-522f18cf2876"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.025940 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") pod \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026023 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") pod \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026058 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026089 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") pod \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\" (UID: \"fe41a890-8a59-4fc7-b392-b7bab2ad5832\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026120 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026185 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") pod \"77380b82-4c44-4cfd-a7b1-e77b060af507\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026262 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") pod \"77380b82-4c44-4cfd-a7b1-e77b060af507\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026289 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") pod \"696eed68-bf2d-4bbd-865f-07998d61f8ab\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026311 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") pod \"696eed68-bf2d-4bbd-865f-07998d61f8ab\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026345 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") pod \"77380b82-4c44-4cfd-a7b1-e77b060af507\" (UID: \"77380b82-4c44-4cfd-a7b1-e77b060af507\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026375 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026416 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") pod \"696eed68-bf2d-4bbd-865f-07998d61f8ab\" (UID: \"696eed68-bf2d-4bbd-865f-07998d61f8ab\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026431 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp" (OuterVolumeSpecName: "tmp") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026477 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") pod \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\" (UID: \"23e39fb8-29b4-4a99-b189-3cd7c8e7f488\") "
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026674 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026686 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh2v2\" (UniqueName: \"kubernetes.io/projected/da3b0eb3-e48f-4080-bfdc-522f18cf2876-kube-api-access-zh2v2\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026695 5116 reconciler_common.go:299] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-tmp\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026704 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da3b0eb3-e48f-4080-bfdc-522f18cf2876-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.026740 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.027469 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities" (OuterVolumeSpecName: "utilities") pod "fe41a890-8a59-4fc7-b392-b7bab2ad5832" (UID: "fe41a890-8a59-4fc7-b392-b7bab2ad5832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.027961 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities" (OuterVolumeSpecName: "utilities") pod "696eed68-bf2d-4bbd-865f-07998d61f8ab" (UID: "696eed68-bf2d-4bbd-865f-07998d61f8ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.028736 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities" (OuterVolumeSpecName: "utilities") pod "77380b82-4c44-4cfd-a7b1-e77b060af507" (UID: "77380b82-4c44-4cfd-a7b1-e77b060af507"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.028902 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl" (OuterVolumeSpecName: "kube-api-access-x8ktl") pod "fe41a890-8a59-4fc7-b392-b7bab2ad5832" (UID: "fe41a890-8a59-4fc7-b392-b7bab2ad5832"). InnerVolumeSpecName "kube-api-access-x8ktl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.030178 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn" (OuterVolumeSpecName: "kube-api-access-6cpmn") pod "77380b82-4c44-4cfd-a7b1-e77b060af507" (UID: "77380b82-4c44-4cfd-a7b1-e77b060af507"). InnerVolumeSpecName "kube-api-access-6cpmn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.030550 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5" (OuterVolumeSpecName: "kube-api-access-vwwx5") pod "696eed68-bf2d-4bbd-865f-07998d61f8ab" (UID: "696eed68-bf2d-4bbd-865f-07998d61f8ab"). InnerVolumeSpecName "kube-api-access-vwwx5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.030734 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9" (OuterVolumeSpecName: "kube-api-access-crwp9") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "kube-api-access-crwp9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.032979 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "23e39fb8-29b4-4a99-b189-3cd7c8e7f488" (UID: "23e39fb8-29b4-4a99-b189-3cd7c8e7f488"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.052677 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "696eed68-bf2d-4bbd-865f-07998d61f8ab" (UID: "696eed68-bf2d-4bbd-865f-07998d61f8ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.078971 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "77380b82-4c44-4cfd-a7b1-e77b060af507" (UID: "77380b82-4c44-4cfd-a7b1-e77b060af507"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127729 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127772 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwwx5\" (UniqueName: \"kubernetes.io/projected/696eed68-bf2d-4bbd-865f-07998d61f8ab-kube-api-access-vwwx5\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127790 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crwp9\" (UniqueName: \"kubernetes.io/projected/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-kube-api-access-crwp9\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127810 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8ktl\" (UniqueName: \"kubernetes.io/projected/fe41a890-8a59-4fc7-b392-b7bab2ad5832-kube-api-access-x8ktl\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127829 5116 reconciler_common.go:299] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23e39fb8-29b4-4a99-b189-3cd7c8e7f488-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127846 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127863 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127881 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/77380b82-4c44-4cfd-a7b1-e77b060af507-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127897 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127912 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/696eed68-bf2d-4bbd-865f-07998d61f8ab-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.127929 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cpmn\" (UniqueName: \"kubernetes.io/projected/77380b82-4c44-4cfd-a7b1-e77b060af507-kube-api-access-6cpmn\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.130160 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe41a890-8a59-4fc7-b392-b7bab2ad5832" (UID: "fe41a890-8a59-4fc7-b392-b7bab2ad5832"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.170422 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"]
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.181013 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"]
Mar 22 00:13:53 crc kubenswrapper[5116]: W0322 00:13:53.184269 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba8a6a03_e32a_4121_86e1_d856ddf7a73b.slice/crio-1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95 WatchSource:0}: Error finding container 1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95: Status 404 returned error can't find the container with id 1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.186032 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4x6l"]
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.187557 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.229039 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe41a890-8a59-4fc7-b392-b7bab2ad5832-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.704128 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" path="/var/lib/kubelet/pods/da3b0eb3-e48f-4080-bfdc-522f18cf2876/volumes"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.869409 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kp7rb" event={"ID":"696eed68-bf2d-4bbd-865f-07998d61f8ab","Type":"ContainerDied","Data":"82447639f8664a7a9be68c50329975aae58c8919cedaf2554c6f5ebb2a14ac22"}
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.869482 5116 scope.go:117] "RemoveContainer" containerID="c41be0d44abbf50de3e3033796a4335c2fb7b8a34b7a76e0caaf18024ae10985"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.869532 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kp7rb"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.875086 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wss9d" event={"ID":"fe41a890-8a59-4fc7-b392-b7bab2ad5832","Type":"ContainerDied","Data":"4076d8a97891e463c22fe9847edcf67c692b44a8dbcd9aa75ba00b5c2c7fdc81"}
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.876355 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm" event={"ID":"23e39fb8-29b4-4a99-b189-3cd7c8e7f488","Type":"ContainerDied","Data":"082b4604427fbc7c5d9bce23172c03602291dda6ccea1696f1f624d1746d3739"}
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.876475 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.878499 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wss9d"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.878727 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrcmf" event={"ID":"77380b82-4c44-4cfd-a7b1-e77b060af507","Type":"ContainerDied","Data":"c8dbd89a41371e9d08f38390365ebbf2b2a5481a8e6093a86e6911bc41519ed3"}
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.878788 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrcmf"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.880645 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" event={"ID":"ba8a6a03-e32a-4121-86e1-d856ddf7a73b","Type":"ContainerStarted","Data":"7980189e946d1931331cd080fb56ff14fd80761a7e28038936449a6a3b51ce3c"}
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.880665 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" event={"ID":"ba8a6a03-e32a-4121-86e1-d856ddf7a73b","Type":"ContainerStarted","Data":"1a36953a68e288a1ce0802063a89fd0b717cb006290f3ef643ee98debd8d1d95"}
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.880998 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.883972 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.894293 5116 scope.go:117] "RemoveContainer" containerID="30c022eef87348aae4e9bdbc424e5f6c1baa0356ea65b1994f9191806ffd90dd"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.915843 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-547dbd544d-c8r5t" podStartSLOduration=1.915819884 podStartE2EDuration="1.915819884s" podCreationTimestamp="2026-03-22 00:13:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:13:53.907683387 +0000 UTC m=+304.929984760" watchObservedRunningTime="2026-03-22 00:13:53.915819884 +0000 UTC m=+304.938121257"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.963980 5116 scope.go:117] "RemoveContainer" containerID="60acf1b394737b7d397286a07410ebda0e8083a98b6b46d3e3761b9f5dd3c90c"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.970900 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"]
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.988865 5116 scope.go:117] "RemoveContainer" containerID="a7f6a2282b3b24b75800d6fb74f2f5336f0fdfcc664abae1ceae752dc767bce9"
Mar 22 00:13:53 crc kubenswrapper[5116]: I0322 00:13:53.995255 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kp7rb"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.007909 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.010436 5116 scope.go:117] "RemoveContainer" containerID="4615b1e7e8eebd1c4efa8ba0e4d690678c50216687ca96316a048717b240afa6"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.010496 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wss9d"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.014100 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.017383 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrcmf"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.021310 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.023022 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-547dbd544d-lf2zm"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.030331 5116 scope.go:117] "RemoveContainer" containerID="f6ccc3cf8e5e1fac21a450937d818f73d9c8ea21d213cc087495650a551817ba"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.042752 5116 scope.go:117] "RemoveContainer" containerID="a871149953aa28693bec7b6f245c349dbf7ca02a0950b383ad44700084b4c923"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.059449 5116 scope.go:117] "RemoveContainer" containerID="58537fdac1ecd65bbab90d0d56679e194c7535455e823fc1086706266b101727"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.073658 5116 scope.go:117] "RemoveContainer" containerID="3662e71bd41b60c7bbef1f51273ae388448fc2e3a846e9f692b29bbba4929dce"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.087970 5116 scope.go:117] "RemoveContainer" containerID="650e2199c9bfdd8d8095c16f7e86df5e23a5cedd710ebf3a93a1b3818c4e5743"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.506546 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qtstp"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507481 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507531 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507545 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507551 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507561 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507569 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507580 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507587 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507599 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507606 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507907 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507919 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507929 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.507967 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508004 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508012 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508024 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508031 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508054 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508064 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508075 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508084 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="extract-utilities"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508101 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508108 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="extract-content"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508119 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508126 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508139 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508146 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508276 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508286 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508299 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="da3b0eb3-e48f-4080-bfdc-522f18cf2876" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508308 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508314 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" containerName="registry-server"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.508529 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" containerName="marketplace-operator"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.529816 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtstp"]
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.530119 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.532655 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"certified-operators-dockercfg-7cl8d\""
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.660724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc8wn\" (UniqueName: \"kubernetes.io/projected/fb40c619-b024-485e-8fab-590cf66159b3-kube-api-access-mc8wn\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.660831 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-catalog-content\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:13:54 crc
kubenswrapper[5116]: I0322 00:13:54.660937 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-utilities\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.704026 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.710403 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.713857 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-marketplace-dockercfg-gg4w7\"" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.721652 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762138 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mc8wn\" (UniqueName: \"kubernetes.io/projected/fb40c619-b024-485e-8fab-590cf66159b3-kube-api-access-mc8wn\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762231 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-catalog-content\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762338 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-utilities\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762848 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-catalog-content\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.762979 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb40c619-b024-485e-8fab-590cf66159b3-utilities\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.788651 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc8wn\" (UniqueName: \"kubernetes.io/projected/fb40c619-b024-485e-8fab-590cf66159b3-kube-api-access-mc8wn\") pod \"certified-operators-qtstp\" (UID: \"fb40c619-b024-485e-8fab-590cf66159b3\") " pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.849250 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qtstp" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.865558 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.865674 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.865791 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.966917 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967000 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod 
\"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967040 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967565 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.967721 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:54 crc kubenswrapper[5116]: I0322 00:13:54.992249 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod \"redhat-marketplace-m2vjz\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.036102 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.281844 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qtstp"] Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.424804 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:13:55 crc kubenswrapper[5116]: W0322 00:13:55.490070 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f62de57_b304_469a_ab77_b6796a6a482c.slice/crio-2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1 WatchSource:0}: Error finding container 2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1: Status 404 returned error can't find the container with id 2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1 Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.711615 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e39fb8-29b4-4a99-b189-3cd7c8e7f488" path="/var/lib/kubelet/pods/23e39fb8-29b4-4a99-b189-3cd7c8e7f488/volumes" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.712838 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="696eed68-bf2d-4bbd-865f-07998d61f8ab" path="/var/lib/kubelet/pods/696eed68-bf2d-4bbd-865f-07998d61f8ab/volumes" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.713662 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77380b82-4c44-4cfd-a7b1-e77b060af507" path="/var/lib/kubelet/pods/77380b82-4c44-4cfd-a7b1-e77b060af507/volumes" Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.715065 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe41a890-8a59-4fc7-b392-b7bab2ad5832" path="/var/lib/kubelet/pods/fe41a890-8a59-4fc7-b392-b7bab2ad5832/volumes" Mar 22 00:13:55 crc 
kubenswrapper[5116]: I0322 00:13:55.902444 5116 generic.go:358] "Generic (PLEG): container finished" podID="4f62de57-b304-469a-ab77-b6796a6a482c" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" exitCode=0 Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.902499 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e"} Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.902541 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerStarted","Data":"2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1"} Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.904027 5116 generic.go:358] "Generic (PLEG): container finished" podID="fb40c619-b024-485e-8fab-590cf66159b3" containerID="abad814a6663bed533637d502f221eb9c34b41414e0481ef65c9bc4512985738" exitCode=0 Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.904115 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerDied","Data":"abad814a6663bed533637d502f221eb9c34b41414e0481ef65c9bc4512985738"} Mar 22 00:13:55 crc kubenswrapper[5116]: I0322 00:13:55.904153 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerStarted","Data":"1a8e889707bedaf7985da62aeb60a3f1b2bd347bc6adeb83f6a726745ca9eb2c"} Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.929066 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v85n6"] Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 
00:13:56.934743 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.935570 5116 generic.go:358] "Generic (PLEG): container finished" podID="fb40c619-b024-485e-8fab-590cf66159b3" containerID="8433eb6d124456dab2dc0752c6f375f18adb082633723a7b3fa0268293c52692" exitCode=0 Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.935701 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerDied","Data":"8433eb6d124456dab2dc0752c6f375f18adb082633723a7b3fa0268293c52692"} Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.939214 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v85n6"] Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.942003 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-9gxlh\"" Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.942680 5116 generic.go:358] "Generic (PLEG): container finished" podID="4f62de57-b304-469a-ab77-b6796a6a482c" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" exitCode=0 Mar 22 00:13:56 crc kubenswrapper[5116]: I0322 00:13:56.942758 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.100235 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz6lx\" (UniqueName: \"kubernetes.io/projected/132f688e-74fb-4bbb-844a-a23467633e19-kube-api-access-kz6lx\") pod \"redhat-operators-v85n6\" (UID: 
\"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.100310 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-catalog-content\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.100399 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-utilities\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.102434 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wkk2b"] Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.113660 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkk2b"] Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.113810 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.116245 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"community-operators-dockercfg-vrd5f\"" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.201964 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz6lx\" (UniqueName: \"kubernetes.io/projected/132f688e-74fb-4bbb-844a-a23467633e19-kube-api-access-kz6lx\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202038 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-catalog-content\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202202 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-utilities\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202240 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-utilities\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202263 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-catalog-content\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202283 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhmcg\" (UniqueName: \"kubernetes.io/projected/9cc37111-4983-4dbc-a277-b77d2fc47508-kube-api-access-bhmcg\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202637 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-catalog-content\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.202691 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/132f688e-74fb-4bbb-844a-a23467633e19-utilities\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.232528 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz6lx\" (UniqueName: \"kubernetes.io/projected/132f688e-74fb-4bbb-844a-a23467633e19-kube-api-access-kz6lx\") pod \"redhat-operators-v85n6\" (UID: \"132f688e-74fb-4bbb-844a-a23467633e19\") " pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.270667 5116 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v85n6" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304113 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-utilities\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304291 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-catalog-content\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304338 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhmcg\" (UniqueName: \"kubernetes.io/projected/9cc37111-4983-4dbc-a277-b77d2fc47508-kube-api-access-bhmcg\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304838 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-catalog-content\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.304998 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc37111-4983-4dbc-a277-b77d2fc47508-utilities\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " 
pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.324304 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhmcg\" (UniqueName: \"kubernetes.io/projected/9cc37111-4983-4dbc-a277-b77d2fc47508-kube-api-access-bhmcg\") pod \"community-operators-wkk2b\" (UID: \"9cc37111-4983-4dbc-a277-b77d2fc47508\") " pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.429758 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wkk2b" Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.704570 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v85n6"] Mar 22 00:13:57 crc kubenswrapper[5116]: W0322 00:13:57.709626 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod132f688e_74fb_4bbb_844a_a23467633e19.slice/crio-e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922 WatchSource:0}: Error finding container e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922: Status 404 returned error can't find the container with id e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922 Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.868124 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wkk2b"] Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.949661 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qtstp" event={"ID":"fb40c619-b024-485e-8fab-590cf66159b3","Type":"ContainerStarted","Data":"e608f034634a1d6831bf39293af4171fbdc6fab08158d6d732d8693f36437a64"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.953332 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerStarted","Data":"b6167cb8692fe7c05a23fa685b3f7c66b6b19d918c2beb0018d416ac5abf2906"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.955093 5116 generic.go:358] "Generic (PLEG): container finished" podID="132f688e-74fb-4bbb-844a-a23467633e19" containerID="cac52d5e77020597f48a778f9ab7cdf9f57f7963b547fca6b30fcdf32fb1af24" exitCode=0 Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.955358 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerDied","Data":"cac52d5e77020597f48a778f9ab7cdf9f57f7963b547fca6b30fcdf32fb1af24"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.955479 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerStarted","Data":"e66f236caa4417060890c2598757e744510d58e5dfe754d7d108f21dc07bf922"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.963429 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerStarted","Data":"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494"} Mar 22 00:13:57 crc kubenswrapper[5116]: I0322 00:13:57.970459 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qtstp" podStartSLOduration=3.401257656 podStartE2EDuration="3.970438498s" podCreationTimestamp="2026-03-22 00:13:54 +0000 UTC" firstStartedPulling="2026-03-22 00:13:55.905234254 +0000 UTC m=+306.927535627" lastFinishedPulling="2026-03-22 00:13:56.474415056 +0000 UTC m=+307.496716469" observedRunningTime="2026-03-22 00:13:57.966053584 +0000 UTC m=+308.988354967" 
watchObservedRunningTime="2026-03-22 00:13:57.970438498 +0000 UTC m=+308.992739871"
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.011062 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m2vjz" podStartSLOduration=3.444734259 podStartE2EDuration="4.011038131s" podCreationTimestamp="2026-03-22 00:13:54 +0000 UTC" firstStartedPulling="2026-03-22 00:13:55.903337806 +0000 UTC m=+306.925639179" lastFinishedPulling="2026-03-22 00:13:56.469641648 +0000 UTC m=+307.491943051" observedRunningTime="2026-03-22 00:13:58.007507997 +0000 UTC m=+309.029809370" watchObservedRunningTime="2026-03-22 00:13:58.011038131 +0000 UTC m=+309.033339504"
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.972273 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cc37111-4983-4dbc-a277-b77d2fc47508" containerID="1e43b85690d74b47388096377dda7ee7347944fba90172e40e429b79861427bf" exitCode=0
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.972847 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerDied","Data":"1e43b85690d74b47388096377dda7ee7347944fba90172e40e429b79861427bf"}
Mar 22 00:13:58 crc kubenswrapper[5116]: I0322 00:13:58.985823 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerStarted","Data":"e9911ca0330d44857fae6b7f5bdddf69e511cad69fd13be56d12e4bb32c4cd67"}
Mar 22 00:13:59 crc kubenswrapper[5116]: I0322 00:13:59.991993 5116 generic.go:358] "Generic (PLEG): container finished" podID="132f688e-74fb-4bbb-844a-a23467633e19" containerID="e9911ca0330d44857fae6b7f5bdddf69e511cad69fd13be56d12e4bb32c4cd67" exitCode=0
Mar 22 00:13:59 crc kubenswrapper[5116]: I0322 00:13:59.992114 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerDied","Data":"e9911ca0330d44857fae6b7f5bdddf69e511cad69fd13be56d12e4bb32c4cd67"}
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.138568 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.160867 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.161101 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.164182 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.164805 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.165430 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.240682 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"auto-csr-approver-29568974-w8j5j\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") " pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.341905 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"auto-csr-approver-29568974-w8j5j\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") " pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.366219 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"auto-csr-approver-29568974-w8j5j\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") " pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.510521 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:00 crc kubenswrapper[5116]: I0322 00:14:00.893439 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.000101 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568974-w8j5j" event={"ID":"3b94e50a-fe81-48fe-a23a-c15956c06d21","Type":"ContainerStarted","Data":"a83e43008d6fd84b636448e3d5d1fdd64918cee13e1036fbb41ee766e29a6fe0"}
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.001814 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cc37111-4983-4dbc-a277-b77d2fc47508" containerID="55f8617fd0750429feb258f5f415f3d00b17bc8699b50faa2e41ad61cc896c20" exitCode=0
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.001916 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerDied","Data":"55f8617fd0750429feb258f5f415f3d00b17bc8699b50faa2e41ad61cc896c20"}
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.005180 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v85n6" event={"ID":"132f688e-74fb-4bbb-844a-a23467633e19","Type":"ContainerStarted","Data":"07725b0bced148bfeb9beb2ed062e98ce22b68516822b3b06a43b5e4bc108816"}
Mar 22 00:14:01 crc kubenswrapper[5116]: I0322 00:14:01.043833 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v85n6" podStartSLOduration=4.257931239 podStartE2EDuration="5.043817752s" podCreationTimestamp="2026-03-22 00:13:56 +0000 UTC" firstStartedPulling="2026-03-22 00:13:57.956290558 +0000 UTC m=+308.978591941" lastFinishedPulling="2026-03-22 00:13:58.742177081 +0000 UTC m=+309.764478454" observedRunningTime="2026-03-22 00:14:01.040463624 +0000 UTC m=+312.062765007" watchObservedRunningTime="2026-03-22 00:14:01.043817752 +0000 UTC m=+312.066119125"
Mar 22 00:14:02 crc kubenswrapper[5116]: I0322 00:14:02.015588 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wkk2b" event={"ID":"9cc37111-4983-4dbc-a277-b77d2fc47508","Type":"ContainerStarted","Data":"017282eaa72fb6d6afc670e6c507047b70540289ad8c6e1e984790f43486cbe4"}
Mar 22 00:14:03 crc kubenswrapper[5116]: I0322 00:14:03.029601 5116 generic.go:358] "Generic (PLEG): container finished" podID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerID="2acebccbc85d9eff1c121aca735947ed6d77f0c1bc6b89aca01a5fc1d6de9f77" exitCode=0
Mar 22 00:14:03 crc kubenswrapper[5116]: I0322 00:14:03.029654 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568974-w8j5j" event={"ID":"3b94e50a-fe81-48fe-a23a-c15956c06d21","Type":"ContainerDied","Data":"2acebccbc85d9eff1c121aca735947ed6d77f0c1bc6b89aca01a5fc1d6de9f77"}
Mar 22 00:14:03 crc kubenswrapper[5116]: I0322 00:14:03.044934 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wkk2b" podStartSLOduration=5.127768927 podStartE2EDuration="6.044918904s" podCreationTimestamp="2026-03-22 00:13:57 +0000 UTC" firstStartedPulling="2026-03-22 00:13:58.983060655 +0000 UTC m=+310.005362028" lastFinishedPulling="2026-03-22 00:13:59.900210632 +0000 UTC m=+310.922512005" observedRunningTime="2026-03-22 00:14:02.04626093 +0000 UTC m=+313.068562323" watchObservedRunningTime="2026-03-22 00:14:03.044918904 +0000 UTC m=+314.067220267"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.300709 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.391900 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") pod \"3b94e50a-fe81-48fe-a23a-c15956c06d21\" (UID: \"3b94e50a-fe81-48fe-a23a-c15956c06d21\") "
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.397477 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5" (OuterVolumeSpecName: "kube-api-access-85ph5") pod "3b94e50a-fe81-48fe-a23a-c15956c06d21" (UID: "3b94e50a-fe81-48fe-a23a-c15956c06d21"). InnerVolumeSpecName "kube-api-access-85ph5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.493771 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-85ph5\" (UniqueName: \"kubernetes.io/projected/3b94e50a-fe81-48fe-a23a-c15956c06d21-kube-api-access-85ph5\") on node \"crc\" DevicePath \"\""
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.850574 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.850637 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:04 crc kubenswrapper[5116]: I0322 00:14:04.904133 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.037201 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.038097 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.043820 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568974-w8j5j"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.043818 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568974-w8j5j" event={"ID":"3b94e50a-fe81-48fe-a23a-c15956c06d21","Type":"ContainerDied","Data":"a83e43008d6fd84b636448e3d5d1fdd64918cee13e1036fbb41ee766e29a6fe0"}
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.043869 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83e43008d6fd84b636448e3d5d1fdd64918cee13e1036fbb41ee766e29a6fe0"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.077761 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qtstp"
Mar 22 00:14:05 crc kubenswrapper[5116]: I0322 00:14:05.080194 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:06 crc kubenswrapper[5116]: I0322 00:14:06.089193 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m2vjz"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.271646 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.271713 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.311239 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.430822 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.430882 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:07 crc kubenswrapper[5116]: I0322 00:14:07.480645 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:08 crc kubenswrapper[5116]: I0322 00:14:08.094994 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wkk2b"
Mar 22 00:14:08 crc kubenswrapper[5116]: I0322 00:14:08.115730 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v85n6"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.139351 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"]
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.140997 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerName="oc"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.141018 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerName="oc"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.141207 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" containerName="oc"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.154957 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"]
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.155335 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.160123 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\""
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.160501 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\""
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.229524 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.229603 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.229844 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.331144 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.331380 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.331479 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.332463 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.343521 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.348579 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"collect-profiles-29568975-txgb9\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.482281 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:00 crc kubenswrapper[5116]: I0322 00:15:00.892253 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"]
Mar 22 00:15:01 crc kubenswrapper[5116]: I0322 00:15:01.399580 5116 generic.go:358] "Generic (PLEG): container finished" podID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerID="fe8cd95451621929fe3a692f2551d46a8b780ef88806567692d43ff15530e87d" exitCode=0
Mar 22 00:15:01 crc kubenswrapper[5116]: I0322 00:15:01.399820 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9" event={"ID":"cd79fdbb-2811-4c9c-a80b-ad21ccc89560","Type":"ContainerDied","Data":"fe8cd95451621929fe3a692f2551d46a8b780ef88806567692d43ff15530e87d"}
Mar 22 00:15:01 crc kubenswrapper[5116]: I0322 00:15:01.399856 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9" event={"ID":"cd79fdbb-2811-4c9c-a80b-ad21ccc89560","Type":"ContainerStarted","Data":"bc27613337a375500fc7577bd8bca5cde7fd28e3d075541abefdba644008eecc"}
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.620914 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.762474 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") pod \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") "
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.762566 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") pod \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") "
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.762663 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") pod \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\" (UID: \"cd79fdbb-2811-4c9c-a80b-ad21ccc89560\") "
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.763405 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume" (OuterVolumeSpecName: "config-volume") pod "cd79fdbb-2811-4c9c-a80b-ad21ccc89560" (UID: "cd79fdbb-2811-4c9c-a80b-ad21ccc89560"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.767783 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cd79fdbb-2811-4c9c-a80b-ad21ccc89560" (UID: "cd79fdbb-2811-4c9c-a80b-ad21ccc89560"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.768002 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2" (OuterVolumeSpecName: "kube-api-access-svcw2") pod "cd79fdbb-2811-4c9c-a80b-ad21ccc89560" (UID: "cd79fdbb-2811-4c9c-a80b-ad21ccc89560"). InnerVolumeSpecName "kube-api-access-svcw2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.864614 5116 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.864665 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-config-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:15:02 crc kubenswrapper[5116]: I0322 00:15:02.864680 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svcw2\" (UniqueName: \"kubernetes.io/projected/cd79fdbb-2811-4c9c-a80b-ad21ccc89560-kube-api-access-svcw2\") on node \"crc\" DevicePath \"\""
Mar 22 00:15:03 crc kubenswrapper[5116]: I0322 00:15:03.414640 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9"
Mar 22 00:15:03 crc kubenswrapper[5116]: I0322 00:15:03.414665 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568975-txgb9" event={"ID":"cd79fdbb-2811-4c9c-a80b-ad21ccc89560","Type":"ContainerDied","Data":"bc27613337a375500fc7577bd8bca5cde7fd28e3d075541abefdba644008eecc"}
Mar 22 00:15:03 crc kubenswrapper[5116]: I0322 00:15:03.414704 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc27613337a375500fc7577bd8bca5cde7fd28e3d075541abefdba644008eecc"
Mar 22 00:15:23 crc kubenswrapper[5116]: I0322 00:15:23.057716 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:15:23 crc kubenswrapper[5116]: I0322 00:15:23.058423 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:15:53 crc kubenswrapper[5116]: I0322 00:15:53.057107 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:15:53 crc kubenswrapper[5116]: I0322 00:15:53.058047 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.139072 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"]
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.141007 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerName="collect-profiles"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.141026 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerName="collect-profiles"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.141499 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd79fdbb-2811-4c9c-a80b-ad21ccc89560" containerName="collect-profiles"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.153654 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.156531 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.157077 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.157596 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.157843 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"]
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.235902 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"auto-csr-approver-29568976-b4g9d\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") " pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.337149 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"auto-csr-approver-29568976-b4g9d\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") " pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.359019 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"auto-csr-approver-29568976-b4g9d\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") " pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.477116 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.690403 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"]
Mar 22 00:16:00 crc kubenswrapper[5116]: I0322 00:16:00.790456 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568976-b4g9d" event={"ID":"832c911e-4692-4912-8df4-880e98e4c2c1","Type":"ContainerStarted","Data":"d770bc2085ad293f09a4219b3201b776c0a1c8377f5e30755af76fa1942bd7c6"}
Mar 22 00:16:02 crc kubenswrapper[5116]: I0322 00:16:02.807375 5116 generic.go:358] "Generic (PLEG): container finished" podID="832c911e-4692-4912-8df4-880e98e4c2c1" containerID="9cfe6ad0080f9bd011bb482561dcac74a7fc0e16adff6a8d4fce7c2e783aaf6b" exitCode=0
Mar 22 00:16:02 crc kubenswrapper[5116]: I0322 00:16:02.807501 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568976-b4g9d" event={"ID":"832c911e-4692-4912-8df4-880e98e4c2c1","Type":"ContainerDied","Data":"9cfe6ad0080f9bd011bb482561dcac74a7fc0e16adff6a8d4fce7c2e783aaf6b"}
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.091103 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.184992 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") pod \"832c911e-4692-4912-8df4-880e98e4c2c1\" (UID: \"832c911e-4692-4912-8df4-880e98e4c2c1\") "
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.192463 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw" (OuterVolumeSpecName: "kube-api-access-gjtcw") pod "832c911e-4692-4912-8df4-880e98e4c2c1" (UID: "832c911e-4692-4912-8df4-880e98e4c2c1"). InnerVolumeSpecName "kube-api-access-gjtcw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.286940 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjtcw\" (UniqueName: \"kubernetes.io/projected/832c911e-4692-4912-8df4-880e98e4c2c1-kube-api-access-gjtcw\") on node \"crc\" DevicePath \"\""
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.823685 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568976-b4g9d" event={"ID":"832c911e-4692-4912-8df4-880e98e4c2c1","Type":"ContainerDied","Data":"d770bc2085ad293f09a4219b3201b776c0a1c8377f5e30755af76fa1942bd7c6"}
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.823763 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d770bc2085ad293f09a4219b3201b776c0a1c8377f5e30755af76fa1942bd7c6"
Mar 22 00:16:04 crc kubenswrapper[5116]: I0322 00:16:04.823872 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568976-b4g9d"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057069 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057393 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057437 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057897 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.057960 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e" gracePeriod=600
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.962625 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e" exitCode=0
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.962713 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e"}
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.963055 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e"}
Mar 22 00:16:23 crc kubenswrapper[5116]: I0322 00:16:23.963087 5116 scope.go:117] "RemoveContainer" containerID="a25cfaae8e3e082964edf4aaff4c07221b6f7fe72f8c4b2ecedb1fe877eab638"
Mar 22 00:16:49 crc kubenswrapper[5116]: I0322 00:16:49.971644 5116 scope.go:117] "RemoveContainer" containerID="0eaa89fec503d9ea89bbf6737645ec15fbfea7b3c5aaafe399fe76d10fe522f5"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.137900 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"]
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.139862 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" containerName="oc"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.139892 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" containerName="oc"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.140706 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" containerName="oc"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.149360 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"]
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.149552 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.154954 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.155543 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.155761 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.322334 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"auto-csr-approver-29568978-vkbll\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.423412 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"auto-csr-approver-29568978-vkbll\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.461827 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"auto-csr-approver-29568978-vkbll\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.478603 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll"
Mar 22 00:18:00 crc kubenswrapper[5116]: I0322 00:18:00.670505 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"]
Mar 22 00:18:01 crc kubenswrapper[5116]: I0322 00:18:01.558898 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568978-vkbll" event={"ID":"07ada1f6-f713-45cb-8230-b9a2d89878ab","Type":"ContainerStarted","Data":"efda434b38a357545c0d29d28142fb26e6450f11a2171b5254593e5038046112"}
Mar 22 00:18:02 crc kubenswrapper[5116]: I0322 00:18:02.566348 5116 generic.go:358] "Generic (PLEG): container finished" podID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerID="208d35041a700bcc47fefb636464fe18464c55b6addf5e55a5a1888e5fa3efb2" exitCode=0
Mar 22 00:18:02 crc kubenswrapper[5116]: I0322 00:18:02.566471 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568978-vkbll" event={"ID":"07ada1f6-f713-45cb-8230-b9a2d89878ab","Type":"ContainerDied","Data":"208d35041a700bcc47fefb636464fe18464c55b6addf5e55a5a1888e5fa3efb2"}
Mar 22 00:18:03 crc kubenswrapper[5116]: I0322 00:18:03.811430 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll" Mar 22 00:18:03 crc kubenswrapper[5116]: I0322 00:18:03.973625 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") pod \"07ada1f6-f713-45cb-8230-b9a2d89878ab\" (UID: \"07ada1f6-f713-45cb-8230-b9a2d89878ab\") " Mar 22 00:18:03 crc kubenswrapper[5116]: I0322 00:18:03.984569 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2" (OuterVolumeSpecName: "kube-api-access-d5md2") pod "07ada1f6-f713-45cb-8230-b9a2d89878ab" (UID: "07ada1f6-f713-45cb-8230-b9a2d89878ab"). InnerVolumeSpecName "kube-api-access-d5md2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.076327 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5md2\" (UniqueName: \"kubernetes.io/projected/07ada1f6-f713-45cb-8230-b9a2d89878ab-kube-api-access-d5md2\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.581891 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568978-vkbll" event={"ID":"07ada1f6-f713-45cb-8230-b9a2d89878ab","Type":"ContainerDied","Data":"efda434b38a357545c0d29d28142fb26e6450f11a2171b5254593e5038046112"} Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.582287 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efda434b38a357545c0d29d28142fb26e6450f11a2171b5254593e5038046112" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.581912 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568978-vkbll" Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.877396 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:18:04 crc kubenswrapper[5116]: I0322 00:18:04.882484 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568972-5s86m"] Mar 22 00:18:05 crc kubenswrapper[5116]: I0322 00:18:05.703643 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907ec022-a4e4-4d33-8329-52c9bbb71520" path="/var/lib/kubelet/pods/907ec022-a4e4-4d33-8329-52c9bbb71520/volumes" Mar 22 00:18:23 crc kubenswrapper[5116]: I0322 00:18:23.056770 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:18:23 crc kubenswrapper[5116]: I0322 00:18:23.057392 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:18:49 crc kubenswrapper[5116]: I0322 00:18:49.904888 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:18:49 crc kubenswrapper[5116]: I0322 00:18:49.915259 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:18:50 crc kubenswrapper[5116]: I0322 00:18:50.027583 5116 
scope.go:117] "RemoveContainer" containerID="5f050f299176f9b417ea910e3bb8affec9c6d4bf35a6de76d0aa5ed0d88ddf0f" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.057401 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.057810 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.417591 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"] Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.417932 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" containerID="cri-o://6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.418081 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" containerID="cri-o://ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.616380 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.629870 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n9zvq"] Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630499 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" containerID="cri-o://75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630547 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node" containerID="cri-o://75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630640 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging" containerID="cri-o://03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630665 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630802 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" 
containerName="sbdb" containerID="cri-o://d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.630914 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb" containerID="cri-o://c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.632109 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" containerID="cri-o://8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649054 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk"] Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649822 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerName="oc" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649847 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerName="oc" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649858 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649866 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649906 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.649915 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.650046 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="ovnkube-cluster-manager" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.650062 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerName="kube-rbac-proxy" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.650071 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" containerName="oc" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.653498 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.671853 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller" containerID="cri-o://b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" gracePeriod=30 Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.736777 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.736887 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.737025 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.737056 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") pod \"e17ab744-68a7-4a24-8ef2-556696d752fb\" (UID: \"e17ab744-68a7-4a24-8ef2-556696d752fb\") " Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.738201 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.738845 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.752299 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.752398 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4" (OuterVolumeSpecName: "kube-api-access-ntdv4") pod "e17ab744-68a7-4a24-8ef2-556696d752fb" (UID: "e17ab744-68a7-4a24-8ef2-556696d752fb"). InnerVolumeSpecName "kube-api-access-ntdv4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838446 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838520 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838550 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838622 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cd6p\" (UniqueName: \"kubernetes.io/projected/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-kube-api-access-8cd6p\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838742 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntdv4\" (UniqueName: \"kubernetes.io/projected/e17ab744-68a7-4a24-8ef2-556696d752fb-kube-api-access-ntdv4\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838758 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838769 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e17ab744-68a7-4a24-8ef2-556696d752fb-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.838782 5116 reconciler_common.go:299] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e17ab744-68a7-4a24-8ef2-556696d752fb-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 
00:18:53.939583 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.939645 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.939664 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.939701 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cd6p\" (UniqueName: \"kubernetes.io/projected/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-kube-api-access-8cd6p\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.940275 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-env-overrides\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: 
\"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.940443 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovnkube-config\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.944537 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.955981 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cd6p\" (UniqueName: \"kubernetes.io/projected/d9a45017-2c2a-4fa3-9277-4d1d8b674faf-kube-api-access-8cd6p\") pod \"ovnkube-control-plane-97c9b6c48-l9nvk\" (UID: \"d9a45017-2c2a-4fa3-9277-4d1d8b674faf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.967278 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-acl-logging/0.log" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.967755 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-controller/0.log" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.968236 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:18:53 crc kubenswrapper[5116]: I0322 00:18:53.989265 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.035619 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.038059 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.038233 5116 generic.go:358] "Generic (PLEG): container finished" podID="5188f25b-37c3-46f1-b939-199c6e082848" containerID="15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0" exitCode=2 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.038501 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerDied","Data":"15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.039991 5116 scope.go:117] "RemoveContainer" containerID="15ed776a73c12ffc79727de77156edc740c2234810a94e17ee8fc99d259db9c0" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.040400 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rsw9b"] Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041411 5116 generic.go:358] "Generic (PLEG): container finished" podID="e17ab744-68a7-4a24-8ef2-556696d752fb" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041455 5116 generic.go:358] "Generic (PLEG): container finished" podID="e17ab744-68a7-4a24-8ef2-556696d752fb" 
containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041758 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041779 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041798 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041808 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041820 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041827 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041853 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041860 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="sbdb" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041939 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="sbdb" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041952 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041976 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041992 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.041998 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042017 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042022 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042034 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042066 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042088 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kubecfg-setup" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042095 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kubecfg-setup" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042349 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="northd" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042371 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-ovn-metrics" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042381 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042396 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovn-acl-logging" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042414 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="ovnkube-controller" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042430 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="sbdb" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.042444 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="kube-rbac-proxy-node" Mar 22 00:18:54 crc kubenswrapper[5116]: 
I0322 00:18:54.042453 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerName="nbdb" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.048706 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-acl-logging/0.log" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051101 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-n9zvq_ec484e57-1508-45a3-99a3-51dfa8ef6195/ovn-controller/0.log" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051902 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051935 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051948 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051958 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051968 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051977 5116 generic.go:358] "Generic (PLEG): 
container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" exitCode=0 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.051986 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" exitCode=143 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.052000 5116 generic.go:358] "Generic (PLEG): container finished" podID="ec484e57-1508-45a3-99a3-51dfa8ef6195" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" exitCode=143 Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.052444 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059285 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerDied","Data":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059427 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerDied","Data":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059443 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4" event={"ID":"e17ab744-68a7-4a24-8ef2-556696d752fb","Type":"ContainerDied","Data":"7eadfb4600290cb56b95da12f03d4c885e0344117c4889ec529ea4aaac7dd7ce"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059457 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059474 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059486 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059499 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059513 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059525 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059540 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059556 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059562 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059569 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059575 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059581 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059587 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059594 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059589 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059626 5116 scope.go:117] "RemoveContainer" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059600 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.059957 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060028 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060040 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060202 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060214 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060220 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060227 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060233 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060239 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060245 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060254 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060288 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060300 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060305 5116 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060309 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060314 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060319 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060323 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060328 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060333 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060340 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-n9zvq" event={"ID":"ec484e57-1508-45a3-99a3-51dfa8ef6195","Type":"ContainerDied","Data":"43ef6e49a2566cb4645e9989226efbcb7eed21688ac9c6361a1e719ed88b50a9"} Mar 22 
00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060371 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060377 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060381 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060386 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060393 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060398 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060402 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060407 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} Mar 22 
00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.060411 5116 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.082811 5116 scope.go:117] "RemoveContainer" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.112490 5116 scope.go:117] "RemoveContainer" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.113102 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": container with ID starting with ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8 not found: ID does not exist" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113160 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} err="failed to get container status \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": rpc error: code = NotFound desc = could not find container \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": container with ID starting with ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113211 5116 scope.go:117] "RemoveContainer" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.113759 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": container with ID starting with 6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7 not found: ID does not exist" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113800 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} err="failed to get container status \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": rpc error: code = NotFound desc = could not find container \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": container with ID starting with 6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.113823 5116 scope.go:117] "RemoveContainer" containerID="ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.114365 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8"} err="failed to get container status \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": rpc error: code = NotFound desc = could not find container \"ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8\": container with ID starting with ce075de500f205b6f182bf14de1c76d00d30e4bf81c236cb0b79070c70f68cb8 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.114402 5116 scope.go:117] "RemoveContainer" containerID="6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.116095 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"] Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.116433 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7"} err="failed to get container status \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": rpc error: code = NotFound desc = could not find container \"6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7\": container with ID starting with 6e97dd4553f4498ba6a3adb833f5b2ed8e4889c77a486028365a7f9b9cd964b7 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.116462 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.121467 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-57b78d8988-bd7p4"] Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.142982 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143030 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143057 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143087 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143102 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143129 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143157 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143252 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc 
kubenswrapper[5116]: I0322 00:18:54.143281 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143313 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143352 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143375 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143394 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143408 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143449 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143461 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143493 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143517 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143561 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc 
kubenswrapper[5116]: I0322 00:18:54.143620 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") pod \"ec484e57-1508-45a3-99a3-51dfa8ef6195\" (UID: \"ec484e57-1508-45a3-99a3-51dfa8ef6195\") " Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143810 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log" (OuterVolumeSpecName: "node-log") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143979 5116 reconciler_common.go:299] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-node-log\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.143987 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144062 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144096 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144113 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144123 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144154 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144193 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144226 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144249 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144639 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket" (OuterVolumeSpecName: "log-socket") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144683 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144711 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash" (OuterVolumeSpecName: "host-slash") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.144836 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.145240 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.145262 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.145287 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.146242 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.148430 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.148891 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r" (OuterVolumeSpecName: "kube-api-access-8qp9r") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "kube-api-access-8qp9r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.158316 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ec484e57-1508-45a3-99a3-51dfa8ef6195" (UID: "ec484e57-1508-45a3-99a3-51dfa8ef6195"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.173283 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.190248 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.208640 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.234800 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244851 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-systemd-units\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244899 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-env-overrides\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244924 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-ovn\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244946 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-etc-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244974 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.244997 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-var-lib-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245019 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovn-node-metrics-cert\") 
pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245045 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245067 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-kubelet\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245089 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-netns\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245107 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-systemd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245144 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-netd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245160 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-node-log\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245200 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-slash\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245223 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zx6j\" (UniqueName: \"kubernetes.io/projected/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-kube-api-access-2zx6j\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245264 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245284 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-script-lib\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245310 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-bin\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245333 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-log-socket\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245358 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-config\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245423 5116 reconciler_common.go:299] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245434 5116 reconciler_common.go:299] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-log-socket\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc 
kubenswrapper[5116]: I0322 00:18:54.245444 5116 reconciler_common.go:299] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-slash\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245455 5116 reconciler_common.go:299] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245465 5116 reconciler_common.go:299] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245476 5116 reconciler_common.go:299] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245487 5116 reconciler_common.go:299] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245497 5116 reconciler_common.go:299] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245508 5116 reconciler_common.go:299] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245519 5116 reconciler_common.go:299] 
"Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245529 5116 reconciler_common.go:299] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245539 5116 reconciler_common.go:299] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245548 5116 reconciler_common.go:299] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245558 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qp9r\" (UniqueName: \"kubernetes.io/projected/ec484e57-1508-45a3-99a3-51dfa8ef6195-kube-api-access-8qp9r\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.245568 5116 reconciler_common.go:299] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.246863 5116 reconciler_common.go:299] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.247158 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.247188 5116 reconciler_common.go:299] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec484e57-1508-45a3-99a3-51dfa8ef6195-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.247200 5116 reconciler_common.go:299] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec484e57-1508-45a3-99a3-51dfa8ef6195-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.254053 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.270396 5116 scope.go:117] "RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.301387 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317063 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.317519 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317552 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317579 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.317793 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317821 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.317840 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.318241 5116 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318308 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318356 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.318813 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318847 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container 
\"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.318868 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.319217 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319250 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319268 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.319547 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" 
containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319593 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.319610 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.320012 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320033 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320047 5116 scope.go:117] 
"RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.320287 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320313 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320325 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: E0322 00:18:54.320732 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320751 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.320764 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321052 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321096 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321434 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not 
exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321453 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321707 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321726 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321916 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.321934 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322195 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status 
\"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322214 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322369 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322387 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322603 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322647 5116 scope.go:117] "RemoveContainer" 
containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322829 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.322847 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323014 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323037 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323320 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could 
not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323338 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323686 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323711 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323901 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.323921 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 
00:18:54.324098 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324118 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324394 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324418 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324612 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 
75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324632 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324882 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.324944 5116 scope.go:117] "RemoveContainer" containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325192 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325213 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325383 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325403 5116 scope.go:117] "RemoveContainer" containerID="b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325941 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740"} err="failed to get container status \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": rpc error: code = NotFound desc = could not find container \"b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740\": container with ID starting with b80112059c769680231d0571abc4d8d755f8c253db1cbe13d43ac0e2e54d3740 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.325976 5116 scope.go:117] "RemoveContainer" containerID="d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326207 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325"} err="failed to get container status \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": rpc error: code = NotFound desc = could not find container \"d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325\": container with ID starting with d42252499ce156a36b988490d1ac4d16730667b54df95511b2505faa913d1325 not found: ID does not 
exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326228 5116 scope.go:117] "RemoveContainer" containerID="c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326409 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85"} err="failed to get container status \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": rpc error: code = NotFound desc = could not find container \"c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85\": container with ID starting with c15c776411e89d3f2442f0bf335f55967d54b86130e301f1a4c9277c6ef0ff85 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326428 5116 scope.go:117] "RemoveContainer" containerID="8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326669 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f"} err="failed to get container status \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": rpc error: code = NotFound desc = could not find container \"8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f\": container with ID starting with 8a009203926cb93d9546e248fb0b1264b618f7701568ea87799b19b2a79d575f not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326686 5116 scope.go:117] "RemoveContainer" containerID="26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326846 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62"} err="failed to get container status 
\"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": rpc error: code = NotFound desc = could not find container \"26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62\": container with ID starting with 26be9a5d824b7d641b4e5950c4589dd770e0471e6cd07e6d0f51fe0875d2eb62 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.326864 5116 scope.go:117] "RemoveContainer" containerID="75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327053 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93"} err="failed to get container status \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": rpc error: code = NotFound desc = could not find container \"75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93\": container with ID starting with 75f5b296238e0fd9165a2cb9c8d5f149e870fcfba1841a8a894684d066874a93 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327074 5116 scope.go:117] "RemoveContainer" containerID="03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327258 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519"} err="failed to get container status \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": rpc error: code = NotFound desc = could not find container \"03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519\": container with ID starting with 03d2a6ff7f92619bea02bc0e0497968197381987522d9adacd7f4541174be519 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327281 5116 scope.go:117] "RemoveContainer" 
containerID="75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327475 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d"} err="failed to get container status \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": rpc error: code = NotFound desc = could not find container \"75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d\": container with ID starting with 75fde8ce000dc6d718c1170a4ca389495d28a7d6af1e61f5509c2c27a542fa9d not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327492 5116 scope.go:117] "RemoveContainer" containerID="f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.327659 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375"} err="failed to get container status \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": rpc error: code = NotFound desc = could not find container \"f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375\": container with ID starting with f9c41256090ebd9f997ea75e5802bb847042020b1427b5ed0744e5ab0389d375 not found: ID does not exist" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349038 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-env-overrides\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349102 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-ovn\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349124 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-etc-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349149 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349278 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-ovn\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349331 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.349398 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-etc-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350087 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-env-overrides\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350160 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-var-lib-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350303 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovn-node-metrics-cert\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350324 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-var-lib-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350347 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350419 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-kubelet\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350473 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-netns\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350493 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-systemd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350593 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-netd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350604 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-kubelet\") pod 
\"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350635 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-node-log\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350671 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-systemd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350673 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-netd\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350635 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-netns\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350727 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-node-log\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc 
kubenswrapper[5116]: I0322 00:18:54.350775 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-slash\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350832 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zx6j\" (UniqueName: \"kubernetes.io/projected/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-kube-api-access-2zx6j\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350878 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-slash\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350985 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-run-openvswitch\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351022 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.350992 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-run-ovn-kubernetes\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351057 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-script-lib\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351098 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-bin\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351139 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-log-socket\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351237 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-config\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351305 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-systemd-units\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351320 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-host-cni-bin\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351352 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-systemd-units\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351475 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-log-socket\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-script-lib\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.351963 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovnkube-config\") pod 
\"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.367035 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-ovn-node-metrics-cert\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.369783 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zx6j\" (UniqueName: \"kubernetes.io/projected/16fdd440-8f3d-4ab4-a86e-85bf7c1ee883-kube-api-access-2zx6j\") pod \"ovnkube-node-rsw9b\" (UID: \"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883\") " pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.386044 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.391279 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n9zvq"] Mar 22 00:18:54 crc kubenswrapper[5116]: I0322 00:18:54.400282 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-n9zvq"] Mar 22 00:18:54 crc kubenswrapper[5116]: W0322 00:18:54.423268 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fdd440_8f3d_4ab4_a86e_85bf7c1ee883.slice/crio-7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb WatchSource:0}: Error finding container 7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb: Status 404 returned error can't find the container with id 7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.063118 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.063299 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9sq6c" event={"ID":"5188f25b-37c3-46f1-b939-199c6e082848","Type":"ContainerStarted","Data":"4403801b89ed99b6ab9a49e3e33da08a8d566065fa2e776b22226d031e88abf8"} Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.068643 5116 generic.go:358] "Generic (PLEG): container finished" podID="16fdd440-8f3d-4ab4-a86e-85bf7c1ee883" containerID="9637c7982f5541f4863e209ccefbe5a3de02eb6f01f5fd44a7c7428079f51db3" exitCode=0 Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.068720 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" 
event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerDied","Data":"9637c7982f5541f4863e209ccefbe5a3de02eb6f01f5fd44a7c7428079f51db3"} Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.068742 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"7bf7458e833fbff321d3d4569c27d7fc82d52e776448bb3c7fe681018ed9a3eb"} Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.074077 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" event={"ID":"d9a45017-2c2a-4fa3-9277-4d1d8b674faf","Type":"ContainerStarted","Data":"26e2afe933d986e2c7a725e228aeba5907387b34ecaec01a61cfa250ee737719"} Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.074117 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" event={"ID":"d9a45017-2c2a-4fa3-9277-4d1d8b674faf","Type":"ContainerStarted","Data":"6a45e13aa9c81e20728a9bc8d289081021e3ea28336dc4a228b385a1765b48f0"} Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.074133 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" event={"ID":"d9a45017-2c2a-4fa3-9277-4d1d8b674faf","Type":"ContainerStarted","Data":"c6bd055f4ea29e0ea189cc3bc424ed69095f061fb751917ab55890b51db96f8e"} Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.098105 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-97c9b6c48-l9nvk" podStartSLOduration=2.098088355 podStartE2EDuration="2.098088355s" podCreationTimestamp="2026-03-22 00:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:18:55.094419711 +0000 UTC m=+606.116721084" 
watchObservedRunningTime="2026-03-22 00:18:55.098088355 +0000 UTC m=+606.120389728" Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.705605 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17ab744-68a7-4a24-8ef2-556696d752fb" path="/var/lib/kubelet/pods/e17ab744-68a7-4a24-8ef2-556696d752fb/volumes" Mar 22 00:18:55 crc kubenswrapper[5116]: I0322 00:18:55.706886 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec484e57-1508-45a3-99a3-51dfa8ef6195" path="/var/lib/kubelet/pods/ec484e57-1508-45a3-99a3-51dfa8ef6195/volumes" Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085641 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"8ca5e805f39da4e140c396a8f79429ea198e0b1151f50173e35116ca3b605426"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085685 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"e340de5046b550673ff3fed848836b28e00dde668e71925c8644ca64ff2d85c4"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085695 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"661756b1ac6636342786b79269815da09abc6fe26ff33701897a5b1bfff92e8c"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085704 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"315b8e7fb1607aa4c1b643495f6de8b806b06588a5eb7e11d996ddc9e320d085"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085714 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"f5697292eeb4e45b09fba5203605fa49faae7deb1295a75e0b8b92e9ddf4ee32"} Mar 22 00:18:56 crc kubenswrapper[5116]: I0322 00:18:56.085725 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"577160a3e8d068db15afd3a913144caa27408dcca13cda44c8a9e6fe4896a096"} Mar 22 00:18:58 crc kubenswrapper[5116]: I0322 00:18:58.098557 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"173b0b1f3d2caca659e6628f85cb4c9d689999d6a6ffb57b65d5fe3d25ced049"} Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121160 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" event={"ID":"16fdd440-8f3d-4ab4-a86e-85bf7c1ee883","Type":"ContainerStarted","Data":"070bfc5b096e07b982c3085b0f52711e8f80913d668dfd5ae5d7a9d5a2ec3fc7"} Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121780 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121795 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.121804 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.148328 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.149699 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:01 crc kubenswrapper[5116]: I0322 00:19:01.155962 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" podStartSLOduration=7.155943691 podStartE2EDuration="7.155943691s" podCreationTimestamp="2026-03-22 00:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:19:01.153316763 +0000 UTC m=+612.175618146" watchObservedRunningTime="2026-03-22 00:19:01.155943691 +0000 UTC m=+612.178245064" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.057299 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058054 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058113 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058739 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.058813 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e" gracePeriod=600 Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.255100 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e" exitCode=0 Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.255196 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e"} Mar 22 00:19:23 crc kubenswrapper[5116]: I0322 00:19:23.255251 5116 scope.go:117] "RemoveContainer" containerID="b51ee51711afcb946b670f0d81e2401692bf98b75a88b05923f26883062fdb6e" Mar 22 00:19:24 crc kubenswrapper[5116]: I0322 00:19:24.267826 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"} Mar 22 00:19:33 crc kubenswrapper[5116]: I0322 00:19:33.171623 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rsw9b" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.496078 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 
00:19:58.496946 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m2vjz" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" containerID="cri-o://1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" gracePeriod=30 Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.864025 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.939415 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") pod \"4f62de57-b304-469a-ab77-b6796a6a482c\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.939634 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") pod \"4f62de57-b304-469a-ab77-b6796a6a482c\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.939699 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") pod \"4f62de57-b304-469a-ab77-b6796a6a482c\" (UID: \"4f62de57-b304-469a-ab77-b6796a6a482c\") " Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.941427 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities" (OuterVolumeSpecName: "utilities") pod "4f62de57-b304-469a-ab77-b6796a6a482c" (UID: "4f62de57-b304-469a-ab77-b6796a6a482c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.945731 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9" (OuterVolumeSpecName: "kube-api-access-xwxf9") pod "4f62de57-b304-469a-ab77-b6796a6a482c" (UID: "4f62de57-b304-469a-ab77-b6796a6a482c"). InnerVolumeSpecName "kube-api-access-xwxf9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:19:58 crc kubenswrapper[5116]: I0322 00:19:58.964082 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f62de57-b304-469a-ab77-b6796a6a482c" (UID: "4f62de57-b304-469a-ab77-b6796a6a482c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.042311 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwxf9\" (UniqueName: \"kubernetes.io/projected/4f62de57-b304-469a-ab77-b6796a6a482c-kube-api-access-xwxf9\") on node \"crc\" DevicePath \"\"" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.042366 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.042384 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62de57-b304-469a-ab77-b6796a6a482c-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.495675 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 
00:19:59.496491 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-utilities" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496518 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-utilities" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496539 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-content" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496547 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="extract-content" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496560 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496569 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.496691 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" containerName="registry-server" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.503712 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504454 5116 generic.go:358] "Generic (PLEG): container finished" podID="4f62de57-b304-469a-ab77-b6796a6a482c" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" exitCode=0 Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504520 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m2vjz" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504555 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494"} Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504597 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m2vjz" event={"ID":"4f62de57-b304-469a-ab77-b6796a6a482c","Type":"ContainerDied","Data":"2bfb9b2cbc2bf9b208015b497f6aa3a527695246821234f87810cec830fcdda1"} Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.504619 5116 scope.go:117] "RemoveContainer" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.508623 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.524001 5116 scope.go:117] "RemoveContainer" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.546412 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.556557 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m2vjz"] Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.557400 5116 scope.go:117] "RemoveContainer" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.575307 5116 scope.go:117] "RemoveContainer" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" Mar 22 00:19:59 crc 
kubenswrapper[5116]: E0322 00:19:59.576059 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494\": container with ID starting with 1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494 not found: ID does not exist" containerID="1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576100 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494"} err="failed to get container status \"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494\": rpc error: code = NotFound desc = could not find container \"1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494\": container with ID starting with 1d937792336588d7f4a3fd335a18f87c926bcb88e8eceb369806c30a1f306494 not found: ID does not exist" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576121 5116 scope.go:117] "RemoveContainer" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" Mar 22 00:19:59 crc kubenswrapper[5116]: E0322 00:19:59.576485 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155\": container with ID starting with 6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155 not found: ID does not exist" containerID="6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576528 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155"} err="failed to get container status 
\"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155\": rpc error: code = NotFound desc = could not find container \"6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155\": container with ID starting with 6173164482aab8a51a8f73c58ede47ca239c1ae1606dc490f2a8bb20ad689155 not found: ID does not exist" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.576554 5116 scope.go:117] "RemoveContainer" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" Mar 22 00:19:59 crc kubenswrapper[5116]: E0322 00:19:59.577055 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e\": container with ID starting with 27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e not found: ID does not exist" containerID="27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.577088 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e"} err="failed to get container status \"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e\": rpc error: code = NotFound desc = could not find container \"27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e\": container with ID starting with 27e9af3227635826bdc0e57820bf83bfd970557bab506c2c6b63a6bb5dd29c2e not found: ID does not exist" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653276 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-bound-sa-token\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 
00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653330 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-tls\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653380 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85d2e292-d36f-4bca-82b5-0a2770f13848-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653408 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85d2e292-d36f-4bca-82b5-0a2770f13848-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653442 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653545 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-trusted-ca\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653631 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wl8\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-kube-api-access-z6wl8\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.653719 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-certificates\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.673238 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.704388 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f62de57-b304-469a-ab77-b6796a6a482c" path="/var/lib/kubelet/pods/4f62de57-b304-469a-ab77-b6796a6a482c/volumes" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.754917 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-certificates\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755002 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-bound-sa-token\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755201 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-tls\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755442 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85d2e292-d36f-4bca-82b5-0a2770f13848-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755517 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85d2e292-d36f-4bca-82b5-0a2770f13848-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755673 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-trusted-ca\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755747 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wl8\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-kube-api-access-z6wl8\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.755962 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/85d2e292-d36f-4bca-82b5-0a2770f13848-ca-trust-extracted\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.756766 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-certificates\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.756776 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85d2e292-d36f-4bca-82b5-0a2770f13848-trusted-ca\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.760640 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-registry-tls\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.761339 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/85d2e292-d36f-4bca-82b5-0a2770f13848-installation-pull-secrets\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.773260 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-bound-sa-token\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.778881 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wl8\" (UniqueName: \"kubernetes.io/projected/85d2e292-d36f-4bca-82b5-0a2770f13848-kube-api-access-z6wl8\") pod \"image-registry-5d9d95bf5b-5kxj9\" (UID: \"85d2e292-d36f-4bca-82b5-0a2770f13848\") " pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:19:59 crc kubenswrapper[5116]: I0322 00:19:59.827443 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.141383 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"]
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.152770 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"]
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.152865 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.155698 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.156487 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.160757 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.264261 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"auto-csr-approver-29568980-ksbk2\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.281228 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"]
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.365884 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"auto-csr-approver-29568980-ksbk2\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.385283 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"auto-csr-approver-29568980-ksbk2\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") " pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.470405 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.519265 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" event={"ID":"85d2e292-d36f-4bca-82b5-0a2770f13848","Type":"ContainerStarted","Data":"51c2db15e29179571dd629b15fa52011ea2ebfbdc5e7921b02b67557958ca769"}
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.519341 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" event={"ID":"85d2e292-d36f-4bca-82b5-0a2770f13848","Type":"ContainerStarted","Data":"99744a56eaf9459785159148e99774d3cd556e1360a660b8b2e0de57813e055e"}
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.520245 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.551705 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" podStartSLOduration=1.551689778 podStartE2EDuration="1.551689778s" podCreationTimestamp="2026-03-22 00:19:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:20:00.546541362 +0000 UTC m=+671.568842735" watchObservedRunningTime="2026-03-22 00:20:00.551689778 +0000 UTC m=+671.573991151"
Mar 22 00:20:00 crc kubenswrapper[5116]: I0322 00:20:00.651974 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"]
Mar 22 00:20:01 crc kubenswrapper[5116]: I0322 00:20:01.524930 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" event={"ID":"11840734-dc87-4532-b341-aeb889f011c4","Type":"ContainerStarted","Data":"c361e0eea64867e1703b78fe98851ab9c0ff404fb999e8f69b6d6334202ab1c4"}
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.185119 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"]
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.190242 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.191900 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\""
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.234130 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"]
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.291551 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.291639 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.291674 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.393260 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.393333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.393371 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.394121 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.394794 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.417755 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: E0322 00:20:02.459154 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11840734_dc87_4532_b341_aeb889f011c4.slice/crio-conmon-8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8.scope\": RecentStats: unable to find data in memory cache]"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.514658 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.533828 5116 generic.go:358] "Generic (PLEG): container finished" podID="11840734-dc87-4532-b341-aeb889f011c4" containerID="8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8" exitCode=0
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.534051 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" event={"ID":"11840734-dc87-4532-b341-aeb889f011c4","Type":"ContainerDied","Data":"8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8"}
Mar 22 00:20:02 crc kubenswrapper[5116]: I0322 00:20:02.736243 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"]
Mar 22 00:20:02 crc kubenswrapper[5116]: W0322 00:20:02.751030 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd47ba40_7b8e_4f2c_8e16_62a5f085def8.slice/crio-87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2 WatchSource:0}: Error finding container 87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2: Status 404 returned error can't find the container with id 87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2
Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.541946 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerID="44d25f50e90508d11aa6151893869dc818c08d23f4ec081d18453e99cc999cb4" exitCode=0
Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.542011 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"44d25f50e90508d11aa6151893869dc818c08d23f4ec081d18453e99cc999cb4"}
Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.542432 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerStarted","Data":"87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2"}
Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.786998 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.915750 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") pod \"11840734-dc87-4532-b341-aeb889f011c4\" (UID: \"11840734-dc87-4532-b341-aeb889f011c4\") "
Mar 22 00:20:03 crc kubenswrapper[5116]: I0322 00:20:03.925491 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p" (OuterVolumeSpecName: "kube-api-access-fld7p") pod "11840734-dc87-4532-b341-aeb889f011c4" (UID: "11840734-dc87-4532-b341-aeb889f011c4"). InnerVolumeSpecName "kube-api-access-fld7p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.017383 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fld7p\" (UniqueName: \"kubernetes.io/projected/11840734-dc87-4532-b341-aeb889f011c4-kube-api-access-fld7p\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.553719 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568980-ksbk2"
Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.553708 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568980-ksbk2" event={"ID":"11840734-dc87-4532-b341-aeb889f011c4","Type":"ContainerDied","Data":"c361e0eea64867e1703b78fe98851ab9c0ff404fb999e8f69b6d6334202ab1c4"}
Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.554319 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c361e0eea64867e1703b78fe98851ab9c0ff404fb999e8f69b6d6334202ab1c4"
Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.847794 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:20:04 crc kubenswrapper[5116]: I0322 00:20:04.852050 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568974-w8j5j"]
Mar 22 00:20:05 crc kubenswrapper[5116]: I0322 00:20:05.562064 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerID="ec351302955a52d95c88ee77bc5d5bfac4e35eb69fcb5e061e7f64cb8d6956b7" exitCode=0
Mar 22 00:20:05 crc kubenswrapper[5116]: I0322 00:20:05.562111 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"ec351302955a52d95c88ee77bc5d5bfac4e35eb69fcb5e061e7f64cb8d6956b7"}
Mar 22 00:20:05 crc kubenswrapper[5116]: I0322 00:20:05.709073 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b94e50a-fe81-48fe-a23a-c15956c06d21" path="/var/lib/kubelet/pods/3b94e50a-fe81-48fe-a23a-c15956c06d21/volumes"
Mar 22 00:20:06 crc kubenswrapper[5116]: I0322 00:20:06.573475 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerID="09c8ce3bf4207242310e44b1cfbd97c2b09a09c2b340457bb558675ee6f5519e" exitCode=0
Mar 22 00:20:06 crc kubenswrapper[5116]: I0322 00:20:06.573605 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"09c8ce3bf4207242310e44b1cfbd97c2b09a09c2b340457bb558675ee6f5519e"}
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.792541 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.876693 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") pod \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") "
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.876807 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") pod \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") "
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.876848 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") pod \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\" (UID: \"dd47ba40-7b8e-4f2c-8e16-62a5f085def8\") "
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.880118 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle" (OuterVolumeSpecName: "bundle") pod "dd47ba40-7b8e-4f2c-8e16-62a5f085def8" (UID: "dd47ba40-7b8e-4f2c-8e16-62a5f085def8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.885113 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m" (OuterVolumeSpecName: "kube-api-access-whz7m") pod "dd47ba40-7b8e-4f2c-8e16-62a5f085def8" (UID: "dd47ba40-7b8e-4f2c-8e16-62a5f085def8"). InnerVolumeSpecName "kube-api-access-whz7m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.888883 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util" (OuterVolumeSpecName: "util") pod "dd47ba40-7b8e-4f2c-8e16-62a5f085def8" (UID: "dd47ba40-7b8e-4f2c-8e16-62a5f085def8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.978585 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-util\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.978936 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-bundle\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:07 crc kubenswrapper[5116]: I0322 00:20:07.978954 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-whz7m\" (UniqueName: \"kubernetes.io/projected/dd47ba40-7b8e-4f2c-8e16-62a5f085def8-kube-api-access-whz7m\") on node \"crc\" DevicePath \"\""
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.406301 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"]
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407227 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11840734-dc87-4532-b341-aeb889f011c4" containerName="oc"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407260 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="11840734-dc87-4532-b341-aeb889f011c4" containerName="oc"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407286 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="util"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407296 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="util"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407316 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="extract"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407329 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="extract"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407370 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="pull"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407379 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="pull"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407524 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="11840734-dc87-4532-b341-aeb889f011c4" containerName="oc"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.407546 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd47ba40-7b8e-4f2c-8e16-62a5f085def8" containerName="extract"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.430199 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"]
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.430551 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.587841 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.587996 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.588062 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.592644 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48" event={"ID":"dd47ba40-7b8e-4f2c-8e16-62a5f085def8","Type":"ContainerDied","Data":"87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2"}
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.592906 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87b7eae02562b9b861197a3a98f6e647d0bf6b7eeb198e19eb3ec8c7323275d2"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.592825 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.689939 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.690288 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.690443 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.691369 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.691475 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.712320 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.754937 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"
Mar 22 00:20:08 crc kubenswrapper[5116]: I0322 00:20:08.949674 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn"]
Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.200751 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"]
Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.210266 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.216188 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"]
Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.402061 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.402229 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:09 crc kubenswrapper[5116]: I0322 00:20:09.402278 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.881996 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.882085 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.882110 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.887541 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.888756 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.925396 5116 generic.go:358] "Generic (PLEG): container finished" podID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerID="11a3914182db445f6a9e39e4bc97e7f3dd08c68c9a0b646cac172ef7b9d21b07" exitCode=0
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.927950 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"11a3914182db445f6a9e39e4bc97e7f3dd08c68c9a0b646cac172ef7b9d21b07"}
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.928009 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerStarted","Data":"611276c02629ef0f5a6f5a4b86e87c592c692b0303f6a16722779b8cf0191900"}
Mar 22 00:20:10 crc kubenswrapper[5116]: I0322 00:20:10.950769 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.093920 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"
Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.327918 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr"]
Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.931902 5116 generic.go:358] "Generic (PLEG): container finished" podID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerID="02b484c4931a69a97cd0e50e5653d639392ec80ded43684e04efe8cb32713b66" exitCode=0
Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.932102 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"02b484c4931a69a97cd0e50e5653d639392ec80ded43684e04efe8cb32713b66"}
Mar 22 00:20:11 crc kubenswrapper[5116]: I0322 00:20:11.932391 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerStarted","Data":"8f102e5b70ff913d33c6363aceb7b3c4ec4bc9b1fbc6f7dce795f8e50c8f34ca"}
Mar 22 00:20:12 crc kubenswrapper[5116]: I0322 00:20:12.941285 5116 generic.go:358] "Generic (PLEG): container finished" podID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerID="e9d4922302c59ed93bcb0147ed832a245de045f022360469768b996c2d4dd48a" exitCode=0
Mar 22 00:20:12 crc kubenswrapper[5116]: I0322 00:20:12.941342 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"e9d4922302c59ed93bcb0147ed832a245de045f022360469768b996c2d4dd48a"}
Mar 22 00:20:12 crc kubenswrapper[5116]: I0322 00:20:12.944512 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerStarted","Data":"0472789dcb22aa7f57ab96d89df83c568bffeabedf523fab6d1804f1f1e4b421"}
Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.953539 5116 generic.go:358] "Generic (PLEG): container finished" podID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerID="4d5ea94b2e5d05239242984b64e2bd61b5bb1b1b7acfa1ab37e0108951c2af8d" exitCode=0
Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.953620 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"4d5ea94b2e5d05239242984b64e2bd61b5bb1b1b7acfa1ab37e0108951c2af8d"}
Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.956761 5116 generic.go:358] "Generic (PLEG): container finished" podID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerID="0472789dcb22aa7f57ab96d89df83c568bffeabedf523fab6d1804f1f1e4b421" exitCode=0
Mar 22 00:20:13 crc kubenswrapper[5116]: I0322 00:20:13.956805 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"0472789dcb22aa7f57ab96d89df83c568bffeabedf523fab6d1804f1f1e4b421"}
Mar 22 00:20:14 crc kubenswrapper[5116]: I0322 00:20:14.966654 5116 generic.go:358] "Generic (PLEG): container finished"
podID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerID="2bd61c7843ffe7c87bc9273e0c4e5b0529f922d0bb45e5d57ffe32cbf56fbf16" exitCode=0 Mar 22 00:20:14 crc kubenswrapper[5116]: I0322 00:20:14.966779 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"2bd61c7843ffe7c87bc9273e0c4e5b0529f922d0bb45e5d57ffe32cbf56fbf16"} Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.304380 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.366860 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") pod \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.367067 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") pod \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.367113 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") pod \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\" (UID: \"c5f79273-4f52-4d9f-ab31-5af0123ff34c\") " Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.377426 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle" 
(OuterVolumeSpecName: "bundle") pod "c5f79273-4f52-4d9f-ab31-5af0123ff34c" (UID: "c5f79273-4f52-4d9f-ab31-5af0123ff34c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.388540 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn" (OuterVolumeSpecName: "kube-api-access-7xzzn") pod "c5f79273-4f52-4d9f-ab31-5af0123ff34c" (UID: "c5f79273-4f52-4d9f-ab31-5af0123ff34c"). InnerVolumeSpecName "kube-api-access-7xzzn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.394282 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util" (OuterVolumeSpecName: "util") pod "c5f79273-4f52-4d9f-ab31-5af0123ff34c" (UID: "c5f79273-4f52-4d9f-ab31-5af0123ff34c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.469706 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xzzn\" (UniqueName: \"kubernetes.io/projected/c5f79273-4f52-4d9f-ab31-5af0123ff34c-kube-api-access-7xzzn\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.469761 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.469771 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5f79273-4f52-4d9f-ab31-5af0123ff34c-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.975621 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.975644 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn" event={"ID":"c5f79273-4f52-4d9f-ab31-5af0123ff34c","Type":"ContainerDied","Data":"611276c02629ef0f5a6f5a4b86e87c592c692b0303f6a16722779b8cf0191900"} Mar 22 00:20:15 crc kubenswrapper[5116]: I0322 00:20:15.975733 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611276c02629ef0f5a6f5a4b86e87c592c692b0303f6a16722779b8cf0191900" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.333877 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.382240 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") pod \"01e2c74b-adcf-45a0-ab9a-e7375676f470\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.382389 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") pod \"01e2c74b-adcf-45a0-ab9a-e7375676f470\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.382435 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") pod \"01e2c74b-adcf-45a0-ab9a-e7375676f470\" (UID: \"01e2c74b-adcf-45a0-ab9a-e7375676f470\") " Mar 22 00:20:16 crc 
kubenswrapper[5116]: I0322 00:20:16.383133 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle" (OuterVolumeSpecName: "bundle") pod "01e2c74b-adcf-45a0-ab9a-e7375676f470" (UID: "01e2c74b-adcf-45a0-ab9a-e7375676f470"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.390247 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q" (OuterVolumeSpecName: "kube-api-access-tk68q") pod "01e2c74b-adcf-45a0-ab9a-e7375676f470" (UID: "01e2c74b-adcf-45a0-ab9a-e7375676f470"). InnerVolumeSpecName "kube-api-access-tk68q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.397071 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util" (OuterVolumeSpecName: "util") pod "01e2c74b-adcf-45a0-ab9a-e7375676f470" (UID: "01e2c74b-adcf-45a0-ab9a-e7375676f470"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.483967 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.484009 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01e2c74b-adcf-45a0-ab9a-e7375676f470-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.484021 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tk68q\" (UniqueName: \"kubernetes.io/projected/01e2c74b-adcf-45a0-ab9a-e7375676f470-kube-api-access-tk68q\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.983960 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.983956 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr" event={"ID":"01e2c74b-adcf-45a0-ab9a-e7375676f470","Type":"ContainerDied","Data":"8f102e5b70ff913d33c6363aceb7b3c4ec4bc9b1fbc6f7dce795f8e50c8f34ca"} Mar 22 00:20:16 crc kubenswrapper[5116]: I0322 00:20:16.984096 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f102e5b70ff913d33c6363aceb7b3c4ec4bc9b1fbc6f7dce795f8e50c8f34ca" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.004904 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"] Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005459 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005478 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005492 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005498 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005511 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005518 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="util" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005528 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005533 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005564 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005571 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005584 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005592 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="pull" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005690 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5f79273-4f52-4d9f-ab31-5af0123ff34c" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.005703 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="01e2c74b-adcf-45a0-ab9a-e7375676f470" containerName="extract" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.013050 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.029508 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2ccr\"" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.045853 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"] Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.094108 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.094175 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.094262 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.195568 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.195815 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.195860 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.196364 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.196499 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.238454 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.327359 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.844470 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5"] Mar 22 00:20:17 crc kubenswrapper[5116]: W0322 00:20:17.862873 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e8a1901_425e_4555_a4f0_fd2ae65d7fb8.slice/crio-306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8 WatchSource:0}: Error finding container 306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8: Status 404 returned error can't find the container with id 306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8 Mar 22 00:20:17 crc kubenswrapper[5116]: I0322 00:20:17.992033 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerStarted","Data":"306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8"} Mar 22 00:20:19 crc kubenswrapper[5116]: I0322 00:20:19.001509 5116 generic.go:358] "Generic (PLEG): container finished" podID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerID="7812ba22ca5c8e825adbaf832ba6d4439e56aada267ef4753eebb6d66c29afb3" exitCode=0 Mar 22 00:20:19 crc kubenswrapper[5116]: I0322 00:20:19.002084 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"7812ba22ca5c8e825adbaf832ba6d4439e56aada267ef4753eebb6d66c29afb3"} Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.133644 5116 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-55568fc96c-krbrc"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.141924 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.145797 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.146233 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-dockercfg-8g6jf\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.146486 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.157986 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-55568fc96c-krbrc"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.243382 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqfgm\" (UniqueName: \"kubernetes.io/projected/1a434146-4e47-4733-9f73-955a4c92f2d2-kube-api-access-gqfgm\") pod \"obo-prometheus-operator-55568fc96c-krbrc\" (UID: \"1a434146-4e47-4733-9f73-955a4c92f2d2\") " pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.345136 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqfgm\" (UniqueName: \"kubernetes.io/projected/1a434146-4e47-4733-9f73-955a4c92f2d2-kube-api-access-gqfgm\") pod \"obo-prometheus-operator-55568fc96c-krbrc\" (UID: \"1a434146-4e47-4733-9f73-955a4c92f2d2\") " pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 
00:20:20.380204 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqfgm\" (UniqueName: \"kubernetes.io/projected/1a434146-4e47-4733-9f73-955a4c92f2d2-kube-api-access-gqfgm\") pod \"obo-prometheus-operator-55568fc96c-krbrc\" (UID: \"1a434146-4e47-4733-9f73-955a4c92f2d2\") " pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.459629 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.465744 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.470318 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.472804 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-dockercfg-9cpr6\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.473206 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"obo-prometheus-operator-admission-webhook-service-cert\"" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.484221 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.494619 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.502995 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.528364 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd"] Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.548724 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.549283 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.650993 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.651096 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.651143 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.651204 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.678759 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.683268 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2d0f143c-b305-43e1-937e-020d84101219-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf\" (UID: \"2d0f143c-b305-43e1-937e-020d84101219\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.755859 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.755918 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.764739 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.768580 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/399d0e55-3ad7-48ad-ab17-d0ab1fb9879f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd\" (UID: \"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.834299 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.845114 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" Mar 22 00:20:20 crc kubenswrapper[5116]: I0322 00:20:20.851597 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-55568fc96c-krbrc"] Mar 22 00:20:20 crc kubenswrapper[5116]: W0322 00:20:20.892025 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a434146_4e47_4733_9f73_955a4c92f2d2.slice/crio-ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398 WatchSource:0}: Error finding container ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398: Status 404 returned error can't find the container with id ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398 Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.027385 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" event={"ID":"1a434146-4e47-4733-9f73-955a4c92f2d2","Type":"ContainerStarted","Data":"ecf0077f042be66d0f2f0f3e81456e7ae9603ae07d97cfd619efb877f31f6398"} Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.106223 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-587f9c8867-sxrpm"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.118810 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.122588 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"observability-operator-tls\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.125902 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-587f9c8867-sxrpm"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.132613 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"observability-operator-sa-dockercfg-fsqpl\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.202409 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf"] Mar 22 00:20:21 crc kubenswrapper[5116]: W0322 00:20:21.216953 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d0f143c_b305_43e1_937e_020d84101219.slice/crio-e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0 WatchSource:0}: Error finding container e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0: Status 404 returned error can't find the container with id e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0 Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.281502 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcxvb\" (UniqueName: \"kubernetes.io/projected/b998a8ef-dbc2-4004-a589-608b0bf774e7-kube-api-access-mcxvb\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.281741 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b998a8ef-dbc2-4004-a589-608b0bf774e7-observability-operator-tls\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.382976 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcxvb\" (UniqueName: \"kubernetes.io/projected/b998a8ef-dbc2-4004-a589-608b0bf774e7-kube-api-access-mcxvb\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.383503 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b998a8ef-dbc2-4004-a589-608b0bf774e7-observability-operator-tls\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.389714 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b998a8ef-dbc2-4004-a589-608b0bf774e7-observability-operator-tls\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: \"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.404505 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcxvb\" (UniqueName: \"kubernetes.io/projected/b998a8ef-dbc2-4004-a589-608b0bf774e7-kube-api-access-mcxvb\") pod \"observability-operator-587f9c8867-sxrpm\" (UID: 
\"b998a8ef-dbc2-4004-a589-608b0bf774e7\") " pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.462649 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.479292 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bff5dbc55-tpg7b"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.496432 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.500618 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"perses-operator-dockercfg-nftvh\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.502932 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"perses-operator-service-cert\"" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.502966 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bff5dbc55-tpg7b"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.540014 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.541793 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5d9d95bf5b-5kxj9" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586346 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-webhook-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: 
\"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586386 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6k9t\" (UniqueName: \"kubernetes.io/projected/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-kube-api-access-m6k9t\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586427 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-apiservice-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.586470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-openshift-service-ca\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.624755 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691186 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-webhook-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 
00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6k9t\" (UniqueName: \"kubernetes.io/projected/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-kube-api-access-m6k9t\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691274 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-apiservice-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.691332 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-openshift-service-ca\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.693239 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-openshift-service-ca\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.701629 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-apiservice-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " 
pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.709018 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-webhook-cert\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.753804 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6k9t\" (UniqueName: \"kubernetes.io/projected/9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8-kube-api-access-m6k9t\") pod \"perses-operator-5bff5dbc55-tpg7b\" (UID: \"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8\") " pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.840478 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:21 crc kubenswrapper[5116]: I0322 00:20:21.998573 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-587f9c8867-sxrpm"] Mar 22 00:20:22 crc kubenswrapper[5116]: I0322 00:20:22.051192 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" event={"ID":"2d0f143c-b305-43e1-937e-020d84101219","Type":"ContainerStarted","Data":"e5f7817d368ac6b93e25f4608abba19675448038b7b4e589127d6bf0de5ee6a0"} Mar 22 00:20:22 crc kubenswrapper[5116]: I0322 00:20:22.062304 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" event={"ID":"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f","Type":"ContainerStarted","Data":"46d0188e1e2cdb419883fb6b789d161cdd04754e91f713391cfb9e073d58d375"} Mar 22 00:20:22 crc kubenswrapper[5116]: 
I0322 00:20:22.534156 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bff5dbc55-tpg7b"] Mar 22 00:20:23 crc kubenswrapper[5116]: I0322 00:20:23.076123 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" event={"ID":"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8","Type":"ContainerStarted","Data":"729be4ebb6d35cfc2cdd555bbb3c39b505dbc03a2911d424cd5d0a142b7d3a72"} Mar 22 00:20:23 crc kubenswrapper[5116]: I0322 00:20:23.078249 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" event={"ID":"b998a8ef-dbc2-4004-a589-608b0bf774e7","Type":"ContainerStarted","Data":"86654e9bd3a4bdcad31031b04ded59f594b12c4984c95eb018db3d270a78c385"} Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.812385 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-58c4bc569-nwp4h"] Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.907359 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-58c4bc569-nwp4h"] Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.907527 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.910794 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"openshift-service-ca.crt\"" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.911074 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elastic-operator-service-cert\"" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.911067 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elastic-operator-dockercfg-jvfh8\"" Mar 22 00:20:24 crc kubenswrapper[5116]: I0322 00:20:24.912221 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"kube-root-ca.crt\"" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.066625 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-webhook-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.066770 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-apiservice-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.066820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dxj\" (UniqueName: \"kubernetes.io/projected/d6bec193-8107-440f-89aa-944885708496-kube-api-access-27dxj\") pod 
\"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.167930 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27dxj\" (UniqueName: \"kubernetes.io/projected/d6bec193-8107-440f-89aa-944885708496-kube-api-access-27dxj\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.168004 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-webhook-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.168098 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-apiservice-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.177819 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-webhook-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.179718 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/d6bec193-8107-440f-89aa-944885708496-apiservice-cert\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.198890 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27dxj\" (UniqueName: \"kubernetes.io/projected/d6bec193-8107-440f-89aa-944885708496-kube-api-access-27dxj\") pod \"elastic-operator-58c4bc569-nwp4h\" (UID: \"d6bec193-8107-440f-89aa-944885708496\") " pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:25 crc kubenswrapper[5116]: I0322 00:20:25.243854 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.569415 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-78b9bd8798-xdlcm"] Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.589747 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-78b9bd8798-xdlcm"] Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.589829 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.601036 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"interconnect-operator-dockercfg-dfbpf\"" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.701608 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn98v\" (UniqueName: \"kubernetes.io/projected/73441892-3e06-43b0-bb99-44e4ff5f74b9-kube-api-access-qn98v\") pod \"interconnect-operator-78b9bd8798-xdlcm\" (UID: \"73441892-3e06-43b0-bb99-44e4ff5f74b9\") " pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.803490 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn98v\" (UniqueName: \"kubernetes.io/projected/73441892-3e06-43b0-bb99-44e4ff5f74b9-kube-api-access-qn98v\") pod \"interconnect-operator-78b9bd8798-xdlcm\" (UID: \"73441892-3e06-43b0-bb99-44e4ff5f74b9\") " pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.854791 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn98v\" (UniqueName: \"kubernetes.io/projected/73441892-3e06-43b0-bb99-44e4ff5f74b9-kube-api-access-qn98v\") pod \"interconnect-operator-78b9bd8798-xdlcm\" (UID: \"73441892-3e06-43b0-bb99-44e4ff5f74b9\") " pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:26 crc kubenswrapper[5116]: I0322 00:20:26.937997 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" Mar 22 00:20:38 crc kubenswrapper[5116]: I0322 00:20:38.416247 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-58c4bc569-nwp4h"] Mar 22 00:20:38 crc kubenswrapper[5116]: W0322 00:20:38.447696 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bec193_8107_440f_89aa_944885708496.slice/crio-c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6 WatchSource:0}: Error finding container c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6: Status 404 returned error can't find the container with id c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6 Mar 22 00:20:38 crc kubenswrapper[5116]: I0322 00:20:38.474140 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-78b9bd8798-xdlcm"] Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.291506 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" event={"ID":"b998a8ef-dbc2-4004-a589-608b0bf774e7","Type":"ContainerStarted","Data":"54ec4333cb3d2e1db09c103bcc0e2e3c3ecf291a9bed2554865581fdce46c1de"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.292644 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.297027 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" event={"ID":"73441892-3e06-43b0-bb99-44e4ff5f74b9","Type":"ContainerStarted","Data":"57c732f90f4bf5ab948c0d7af0f13f077b6b0d7135d178131004699f5b9d820f"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.303562 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerID="72a12537d821669d307e258cccd0d709b04bdded6776d3e5e7c34f570cfd88f0" exitCode=0 Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.303668 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"72a12537d821669d307e258cccd0d709b04bdded6776d3e5e7c34f570cfd88f0"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.305003 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.314370 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" event={"ID":"399d0e55-3ad7-48ad-ab17-d0ab1fb9879f","Type":"ContainerStarted","Data":"8508c3340f68831ac8d738c014b16a782b7867bcb0454c1208a5ab81440b17b8"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.324674 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" event={"ID":"d6bec193-8107-440f-89aa-944885708496","Type":"ContainerStarted","Data":"c29d5a21ed50743a2727d3f589a43dc3161559525cf33b0aec00fe8b4d7903d6"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.330997 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" event={"ID":"2d0f143c-b305-43e1-937e-020d84101219","Type":"ContainerStarted","Data":"507d4b77ff13cc5afb466cae703417a9f31f021c083dde7a6f803f295bfddbdb"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.338519 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-587f9c8867-sxrpm" podStartSLOduration=2.3046954680000002 
podStartE2EDuration="18.338487605s" podCreationTimestamp="2026-03-22 00:20:21 +0000 UTC" firstStartedPulling="2026-03-22 00:20:22.065066903 +0000 UTC m=+693.087368276" lastFinishedPulling="2026-03-22 00:20:38.09885904 +0000 UTC m=+709.121160413" observedRunningTime="2026-03-22 00:20:39.32963265 +0000 UTC m=+710.351934033" watchObservedRunningTime="2026-03-22 00:20:39.338487605 +0000 UTC m=+710.360788978" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.341230 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" event={"ID":"1a434146-4e47-4733-9f73-955a4c92f2d2","Type":"ContainerStarted","Data":"1a2f57f8fa48bde4099bcfe0277665e66c74a352b0b99625ed585cb64c83d9ac"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.343774 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" event={"ID":"9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8","Type":"ContainerStarted","Data":"6fa7a96ea3f27e4ce7da6bfa8c56107e03adea13d834bc0f2be761d178afe464"} Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.344079 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.444871 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd" podStartSLOduration=2.915826994 podStartE2EDuration="19.444840463s" podCreationTimestamp="2026-03-22 00:20:20 +0000 UTC" firstStartedPulling="2026-03-22 00:20:21.568450576 +0000 UTC m=+692.590751949" lastFinishedPulling="2026-03-22 00:20:38.097464045 +0000 UTC m=+709.119765418" observedRunningTime="2026-03-22 00:20:39.352392463 +0000 UTC m=+710.374693866" watchObservedRunningTime="2026-03-22 00:20:39.444840463 +0000 UTC m=+710.467141836" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.520784 
5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" podStartSLOduration=2.982118432 podStartE2EDuration="18.52076251s" podCreationTimestamp="2026-03-22 00:20:21 +0000 UTC" firstStartedPulling="2026-03-22 00:20:22.581940392 +0000 UTC m=+693.604241775" lastFinishedPulling="2026-03-22 00:20:38.12058448 +0000 UTC m=+709.142885853" observedRunningTime="2026-03-22 00:20:39.510806439 +0000 UTC m=+710.533107832" watchObservedRunningTime="2026-03-22 00:20:39.52076251 +0000 UTC m=+710.543063883" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.554886 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-55568fc96c-krbrc" podStartSLOduration=2.351570606 podStartE2EDuration="19.554861179s" podCreationTimestamp="2026-03-22 00:20:20 +0000 UTC" firstStartedPulling="2026-03-22 00:20:20.895562327 +0000 UTC m=+691.917863700" lastFinishedPulling="2026-03-22 00:20:38.0988529 +0000 UTC m=+709.121154273" observedRunningTime="2026-03-22 00:20:39.537333075 +0000 UTC m=+710.559634448" watchObservedRunningTime="2026-03-22 00:20:39.554861179 +0000 UTC m=+710.577162552" Mar 22 00:20:39 crc kubenswrapper[5116]: I0322 00:20:39.591119 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf" podStartSLOduration=2.692242458 podStartE2EDuration="19.590928872s" podCreationTimestamp="2026-03-22 00:20:20 +0000 UTC" firstStartedPulling="2026-03-22 00:20:21.221889476 +0000 UTC m=+692.244190849" lastFinishedPulling="2026-03-22 00:20:38.12057587 +0000 UTC m=+709.142877263" observedRunningTime="2026-03-22 00:20:39.58156837 +0000 UTC m=+710.603869743" watchObservedRunningTime="2026-03-22 00:20:39.590928872 +0000 UTC m=+710.613230245" Mar 22 00:20:40 crc kubenswrapper[5116]: I0322 00:20:40.358437 5116 generic.go:358] "Generic (PLEG): container finished" 
podID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerID="036133fcef38482e0fc86cfb1126c8a7e10691ba008834fd95bf6087c849dbbf" exitCode=0 Mar 22 00:20:40 crc kubenswrapper[5116]: I0322 00:20:40.359333 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"036133fcef38482e0fc86cfb1126c8a7e10691ba008834fd95bf6087c849dbbf"} Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.763687 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.911378 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") pod \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.911635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") pod \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.911717 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") pod \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\" (UID: \"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8\") " Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.914029 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle" 
(OuterVolumeSpecName: "bundle") pod "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" (UID: "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.930355 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util" (OuterVolumeSpecName: "util") pod "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" (UID: "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:43 crc kubenswrapper[5116]: I0322 00:20:43.936441 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj" (OuterVolumeSpecName: "kube-api-access-xtlkj") pod "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" (UID: "7e8a1901-425e-4555-a4f0-fd2ae65d7fb8"). InnerVolumeSpecName "kube-api-access-xtlkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.014784 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xtlkj\" (UniqueName: \"kubernetes.io/projected/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-kube-api-access-xtlkj\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.014850 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.014866 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e8a1901-425e-4555-a4f0-fd2ae65d7fb8-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.401709 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.401751 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5" event={"ID":"7e8a1901-425e-4555-a4f0-fd2ae65d7fb8","Type":"ContainerDied","Data":"306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8"} Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.401781 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306e4f01729ad8bbfcae9ed407b8f8ebfb81cfe346d54b8c7c4121e0ad254dc8" Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.404822 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" event={"ID":"d6bec193-8107-440f-89aa-944885708496","Type":"ContainerStarted","Data":"0636ada234767a36a9cf7f3c546857611e34e9c4bd551b77148da4ff25e24b74"} Mar 22 00:20:44 crc kubenswrapper[5116]: I0322 00:20:44.428817 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-58c4bc569-nwp4h" podStartSLOduration=15.099320026000001 podStartE2EDuration="20.428786694s" podCreationTimestamp="2026-03-22 00:20:24 +0000 UTC" firstStartedPulling="2026-03-22 00:20:38.457681085 +0000 UTC m=+709.479982458" lastFinishedPulling="2026-03-22 00:20:43.787147753 +0000 UTC m=+714.809449126" observedRunningTime="2026-03-22 00:20:44.425882761 +0000 UTC m=+715.448184134" watchObservedRunningTime="2026-03-22 00:20:44.428786694 +0000 UTC m=+715.451088067" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.225921 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227056 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="extract" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227077 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="extract" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227101 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="util" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227108 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="util" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227147 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="pull" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227157 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="pull" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.227291 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e8a1901-425e-4555-a4f0-fd2ae65d7fb8" containerName="extract" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.368237 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.368415 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.371594 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-remote-ca\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.373848 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-default-es-config\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.374043 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-default-es-transport-certs\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.374660 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-dockercfg-crp6c\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.374797 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"elasticsearch-es-scripts\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.375038 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-http-certs-internal\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.375183 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"elasticsearch-es-unicast-hosts\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.375321 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-xpack-file-realm\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.378367 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-internal-users\"" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542438 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542550 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542707 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542922 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.542971 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-downward-api\") pod 
\"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543086 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543154 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543338 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543396 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543478 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543587 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543613 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543635 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543683 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: 
\"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.543710 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644510 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644567 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644607 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644633 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644708 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644735 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.644753 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-remote-certificate-authorities\") pod 
\"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645261 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645354 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645410 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645663 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645795 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: 
\"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.645849 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646071 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646088 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646212 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646120 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646272 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646309 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646469 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.646489 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc 
kubenswrapper[5116]: I0322 00:20:45.647951 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.653204 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657216 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657271 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657904 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: 
\"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.657924 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.659462 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.663352 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/ccb103ab-2a74-44b8-b853-0da2e0b4a6b5-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:45 crc kubenswrapper[5116]: I0322 00:20:45.695855 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:20:46 crc kubenswrapper[5116]: I0322 00:20:46.676725 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" containerID="cri-o://c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c" gracePeriod=30 Mar 22 00:20:47 crc kubenswrapper[5116]: I0322 00:20:47.441192 5116 generic.go:358] "Generic (PLEG): container finished" podID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerID="c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c" exitCode=0 Mar 22 00:20:47 crc kubenswrapper[5116]: I0322 00:20:47.441299 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerDied","Data":"c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c"} Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.107131 5116 patch_prober.go:28] interesting pod/image-registry-66587d64c8-zwkhp container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.16:5000/healthz\": dial tcp 10.217.0.16:5000: connect: connection refused" start-of-body= Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.107503 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.16:5000/healthz\": dial tcp 10.217.0.16:5000: connect: connection refused" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.139478 5116 scope.go:117] "RemoveContainer" containerID="2acebccbc85d9eff1c121aca735947ed6d77f0c1bc6b89aca01a5fc1d6de9f77" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 
00:20:50.374196 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bff5dbc55-tpg7b" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.444570 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.478792 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" event={"ID":"36ff6a0d-ec37-48dd-9e2b-01bcb5755738","Type":"ContainerDied","Data":"bca4de0caba9859b14c3f0eb17a3776e71425e24f9833420070510404cc3406c"} Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.478869 5116 scope.go:117] "RemoveContainer" containerID="c568767e3e8c6d00f1721bf03851b9ec54aeb271e4894892ac0cc16a9a33722c" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.479091 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66587d64c8-zwkhp" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.530755 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531189 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531559 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531729 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.531855 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532090 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532142 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.532208 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") pod \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\" (UID: \"36ff6a0d-ec37-48dd-9e2b-01bcb5755738\") " Mar 22 00:20:50 crc 
kubenswrapper[5116]: I0322 00:20:50.532803 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.533036 5116 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.537744 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.544698 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.547498 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.549233 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2" (OuterVolumeSpecName: "registry-storage") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "pvc-b21f41aa-58d4-44b1-aeaa-280a8e32ddf2". PluginName "kubernetes.io/csi", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.552638 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.554926 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.563211 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn" (OuterVolumeSpecName: "kube-api-access-hl2cn") pod "36ff6a0d-ec37-48dd-9e2b-01bcb5755738" (UID: "36ff6a0d-ec37-48dd-9e2b-01bcb5755738"). InnerVolumeSpecName "kube-api-access-hl2cn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.613356 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633805 5116 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633833 5116 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633864 5116 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633877 5116 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633891 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hl2cn\" (UniqueName: \"kubernetes.io/projected/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-kube-api-access-hl2cn\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.633900 5116 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/36ff6a0d-ec37-48dd-9e2b-01bcb5755738-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.811735 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:20:50 crc kubenswrapper[5116]: I0322 00:20:50.816802 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66587d64c8-zwkhp"] Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.496161 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerStarted","Data":"b5fa4ee0dfec28e69116324034f41cf34a2fef085e700e1eaca593926850691b"} Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.504109 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" event={"ID":"73441892-3e06-43b0-bb99-44e4ff5f74b9","Type":"ContainerStarted","Data":"6fcfee34878c0e0e032a6302107a8d64e0027a2ff9e4729fdce9f93c8ade0b61"} Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.518702 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-78b9bd8798-xdlcm" podStartSLOduration=13.673626277 podStartE2EDuration="25.518684564s" podCreationTimestamp="2026-03-22 00:20:26 +0000 UTC" firstStartedPulling="2026-03-22 00:20:38.48575944 +0000 UTC m=+709.508060813" lastFinishedPulling="2026-03-22 00:20:50.330817717 +0000 UTC m=+721.353119100" observedRunningTime="2026-03-22 00:20:51.517552107 +0000 UTC m=+722.539853480" watchObservedRunningTime="2026-03-22 00:20:51.518684564 +0000 UTC m=+722.540985937" Mar 22 00:20:51 crc kubenswrapper[5116]: I0322 00:20:51.709874 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" path="/var/lib/kubelet/pods/36ff6a0d-ec37-48dd-9e2b-01bcb5755738/volumes" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.687476 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l"] Mar 
22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.689643 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.689661 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.689801 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="36ff6a0d-ec37-48dd-9e2b-01bcb5755738" containerName="registry" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.751994 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l"] Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.752105 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.756958 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.757310 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-z8t8s\"" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.757497 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.835183 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjg4k\" (UniqueName: \"kubernetes.io/projected/8aad28fd-d043-497c-bd68-7d3515fd76cf-kube-api-access-bjg4k\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: 
\"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.835356 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8aad28fd-d043-497c-bd68-7d3515fd76cf-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.936555 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjg4k\" (UniqueName: \"kubernetes.io/projected/8aad28fd-d043-497c-bd68-7d3515fd76cf-kube-api-access-bjg4k\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.936684 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8aad28fd-d043-497c-bd68-7d3515fd76cf-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.937333 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8aad28fd-d043-497c-bd68-7d3515fd76cf-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:56 crc kubenswrapper[5116]: I0322 00:20:56.972314 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjg4k\" (UniqueName: \"kubernetes.io/projected/8aad28fd-d043-497c-bd68-7d3515fd76cf-kube-api-access-bjg4k\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-smk8l\" (UID: \"8aad28fd-d043-497c-bd68-7d3515fd76cf\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:57 crc kubenswrapper[5116]: I0322 00:20:57.069590 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" Mar 22 00:20:59 crc kubenswrapper[5116]: I0322 00:20:59.306193 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l"] Mar 22 00:20:59 crc kubenswrapper[5116]: W0322 00:20:59.322142 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8aad28fd_d043_497c_bd68_7d3515fd76cf.slice/crio-670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296 WatchSource:0}: Error finding container 670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296: Status 404 returned error can't find the container with id 670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296 Mar 22 00:20:59 crc kubenswrapper[5116]: I0322 00:20:59.578097 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" event={"ID":"8aad28fd-d043-497c-bd68-7d3515fd76cf","Type":"ContainerStarted","Data":"670533228bdfe4c62fea7ae72467e1c84e554e057c1c92fb652349b1cb818296"} Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.698494 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" 
event={"ID":"8aad28fd-d043-497c-bd68-7d3515fd76cf","Type":"ContainerStarted","Data":"b7e243cfb52fe05a829b147c433706c9d55a2f73de1fe99f38444f54f4b18ed0"} Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.701060 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerStarted","Data":"6d24febd3fe6a1d1435e03e9cf028a5983f2b9b3281aa427ae12f63a2758bb7b"} Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.724237 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-smk8l" podStartSLOduration=3.845346365 podStartE2EDuration="18.724212517s" podCreationTimestamp="2026-03-22 00:20:56 +0000 UTC" firstStartedPulling="2026-03-22 00:20:59.326131261 +0000 UTC m=+730.348432634" lastFinishedPulling="2026-03-22 00:21:14.204997413 +0000 UTC m=+745.227298786" observedRunningTime="2026-03-22 00:21:14.721095437 +0000 UTC m=+745.743396840" watchObservedRunningTime="2026-03-22 00:21:14.724212517 +0000 UTC m=+745.746513890" Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.931831 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:21:14 crc kubenswrapper[5116]: I0322 00:21:14.968697 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 22 00:21:16 crc kubenswrapper[5116]: I0322 00:21:16.714937 5116 generic.go:358] "Generic (PLEG): container finished" podID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerID="6d24febd3fe6a1d1435e03e9cf028a5983f2b9b3281aa427ae12f63a2758bb7b" exitCode=0 Mar 22 00:21:16 crc kubenswrapper[5116]: I0322 00:21:16.715421 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" 
event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerDied","Data":"6d24febd3fe6a1d1435e03e9cf028a5983f2b9b3281aa427ae12f63a2758bb7b"} Mar 22 00:21:17 crc kubenswrapper[5116]: I0322 00:21:17.723682 5116 generic.go:358] "Generic (PLEG): container finished" podID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerID="3e1c524b1c5bc74c3c638ea86eeead40ea3a9c7c3a14e091a809e129a835dca8" exitCode=0 Mar 22 00:21:17 crc kubenswrapper[5116]: I0322 00:21:17.723796 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerDied","Data":"3e1c524b1c5bc74c3c638ea86eeead40ea3a9c7c3a14e091a809e129a835dca8"} Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.353953 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8tskc"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.358980 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.361809 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.361829 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-n648w\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.362480 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.374505 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8tskc"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.485486 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.485820 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46wr7\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-kube-api-access-46wr7\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.587511 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-bound-sa-token\") pod 
\"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.587616 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46wr7\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-kube-api-access-46wr7\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.609940 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.611369 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46wr7\" (UniqueName: \"kubernetes.io/projected/57a02b12-f3d8-4264-af22-4fb7bc40602f-kube-api-access-46wr7\") pod \"cert-manager-webhook-597b96b99b-8tskc\" (UID: \"57a02b12-f3d8-4264-af22-4fb7bc40602f\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.625914 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.631955 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.634160 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-1-sys-config\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.634522 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-1-global-ca\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.634732 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-1-ca\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.635935 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.645351 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.676429 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.737372 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"ccb103ab-2a74-44b8-b853-0da2e0b4a6b5","Type":"ContainerStarted","Data":"be87eb9395ad9052d5f304afc3379527d4242f64eab5335d22007c30a2e0f4c7"} Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.737527 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.772267 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=10.061333539 podStartE2EDuration="33.772238912s" podCreationTimestamp="2026-03-22 00:20:45 +0000 UTC" firstStartedPulling="2026-03-22 00:20:50.630089333 +0000 UTC m=+721.652390706" lastFinishedPulling="2026-03-22 00:21:14.340994706 +0000 UTC m=+745.363296079" observedRunningTime="2026-03-22 00:21:18.766866929 +0000 UTC m=+749.789168302" watchObservedRunningTime="2026-03-22 00:21:18.772238912 +0000 UTC m=+749.794540295" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789881 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789940 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: 
\"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789967 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.789997 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790030 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790048 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790066 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790092 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790130 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790185 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790220 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: 
\"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.790248 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.892344 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.892920 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.892978 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893036 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893070 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893089 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893148 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893204 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893230 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893263 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893327 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893351 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893453 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.893495 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.894396 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.894410 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.894824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.895206 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: 
\"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.895961 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.896378 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.896483 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.904281 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.905116 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.917896 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"service-telemetry-operator-1-build\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.962947 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:18 crc kubenswrapper[5116]: I0322 00:21:18.971409 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8tskc"] Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.400818 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:19 crc kubenswrapper[5116]: W0322 00:21:19.410545 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64d31296_d621_4ea6_9145_d61b41e2f2f8.slice/crio-1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8 WatchSource:0}: Error finding container 1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8: Status 404 returned error can't find the container with id 1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8 Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.547913 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"] Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.553080 5116 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.556079 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-h5hnt\"" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.557912 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"] Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.704030 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.704598 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxdb\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-kube-api-access-kqxdb\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.760507 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerStarted","Data":"1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8"} Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.763238 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" 
event={"ID":"57a02b12-f3d8-4264-af22-4fb7bc40602f","Type":"ContainerStarted","Data":"b34638f3e93a0bacb5347bb40ec5c785a71a63c33097eebac668cd34743cbfd8"} Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.807133 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.807923 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxdb\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-kube-api-access-kqxdb\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.841808 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.842564 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxdb\" (UniqueName: \"kubernetes.io/projected/cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69-kube-api-access-kqxdb\") pod \"cert-manager-cainjector-8966b78d4-mnbt9\" (UID: \"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:19 crc kubenswrapper[5116]: I0322 00:21:19.871546 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" Mar 22 00:21:20 crc kubenswrapper[5116]: I0322 00:21:20.104453 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-mnbt9"] Mar 22 00:21:20 crc kubenswrapper[5116]: I0322 00:21:20.779274 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" event={"ID":"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69","Type":"ContainerStarted","Data":"06d769a2727f985e1eb4f486ba4dbd4f8b0b1367735c8fcef2d93b288e543534"} Mar 22 00:21:23 crc kubenswrapper[5116]: I0322 00:21:23.057431 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:21:23 crc kubenswrapper[5116]: I0322 00:21:23.058027 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.650188 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.858009 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" event={"ID":"57a02b12-f3d8-4264-af22-4fb7bc40602f","Type":"ContainerStarted","Data":"aceab5083cda3271b1f244d2f653dc7cb9e730004ed36a2c861faf10dd96361b"} Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.858153 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.860970 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" event={"ID":"cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69","Type":"ContainerStarted","Data":"63e33c9dcfd4aeaabf58f24891e433e39748f03efbaa7b9bada0d7cde549e657"} Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.863096 5116 generic.go:358] "Generic (PLEG): container finished" podID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerID="f3b30650b7f17897ba5577d414b6e7254548d88f736d4402f1743debfd49c680" exitCode=0 Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.863220 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerDied","Data":"f3b30650b7f17897ba5577d414b6e7254548d88f736d4402f1743debfd49c680"} Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.885857 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" podStartSLOduration=2.220774982 podStartE2EDuration="10.885826953s" podCreationTimestamp="2026-03-22 00:21:18 +0000 UTC" firstStartedPulling="2026-03-22 00:21:19.022878791 +0000 UTC m=+750.045180164" lastFinishedPulling="2026-03-22 00:21:27.687930762 +0000 UTC m=+758.710232135" observedRunningTime="2026-03-22 00:21:28.879253191 +0000 UTC m=+759.901554624" watchObservedRunningTime="2026-03-22 00:21:28.885826953 +0000 UTC m=+759.908128366" Mar 22 00:21:28 crc kubenswrapper[5116]: I0322 00:21:28.941126 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-mnbt9" podStartSLOduration=2.385827493 podStartE2EDuration="9.941095034s" podCreationTimestamp="2026-03-22 00:21:19 +0000 UTC" firstStartedPulling="2026-03-22 00:21:20.126392919 +0000 UTC m=+751.148694292" 
lastFinishedPulling="2026-03-22 00:21:27.68166046 +0000 UTC m=+758.703961833" observedRunningTime="2026-03-22 00:21:28.936766754 +0000 UTC m=+759.959068137" watchObservedRunningTime="2026-03-22 00:21:28.941095034 +0000 UTC m=+759.963396407" Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.840814 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerName="elasticsearch" probeResult="failure" output=< Mar 22 00:21:29 crc kubenswrapper[5116]: {"timestamp": "2026-03-22T00:21:29+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 22 00:21:29 crc kubenswrapper[5116]: > Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.873924 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerStarted","Data":"0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273"} Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.874097 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" containerID="cri-o://0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273" gracePeriod=30 Mar 22 00:21:29 crc kubenswrapper[5116]: I0322 00:21:29.907509 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=3.625996164 podStartE2EDuration="11.907479542s" podCreationTimestamp="2026-03-22 00:21:18 +0000 UTC" firstStartedPulling="2026-03-22 00:21:19.41436989 +0000 UTC m=+750.436671263" lastFinishedPulling="2026-03-22 00:21:27.695853268 +0000 UTC m=+758.718154641" observedRunningTime="2026-03-22 00:21:29.896955312 +0000 UTC m=+760.919256685" watchObservedRunningTime="2026-03-22 
00:21:29.907479542 +0000 UTC m=+760.929780915" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.314997 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.634644 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.634823 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.637554 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-2-global-ca\"" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.637676 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-2-sys-config\"" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.640079 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-2-ca\"" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682246 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682328 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: 
\"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682408 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682452 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682480 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682507 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682523 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682555 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682577 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682632 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682659 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-2-build\" (UID: 
\"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.682697 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784184 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784638 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.784910 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785033 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785153 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785344 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 
00:21:30.785587 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785708 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785818 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.785918 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.787447 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.787885 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788146 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788362 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788479 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788593 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: 
\"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788742 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788897 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.788936 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.796979 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.800264 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: 
\"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.807478 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"service-telemetry-operator-2-build\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:30 crc kubenswrapper[5116]: I0322 00:21:30.953801 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:21:34 crc kubenswrapper[5116]: I0322 00:21:34.792503 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 22 00:21:34 crc kubenswrapper[5116]: I0322 00:21:34.836377 5116 prober.go:120] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="ccb103ab-2a74-44b8-b853-0da2e0b4a6b5" containerName="elasticsearch" probeResult="failure" output=< Mar 22 00:21:34 crc kubenswrapper[5116]: {"timestamp": "2026-03-22T00:21:34+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 22 00:21:34 crc kubenswrapper[5116]: > Mar 22 00:21:34 crc kubenswrapper[5116]: I0322 00:21:34.878201 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8tskc" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.533542 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-nt75l"] Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.707481 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-759f64656b-nt75l"] Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.707604 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.720594 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-85xrz\"" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.796778 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-bound-sa-token\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.797027 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxcdf\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-kube-api-access-gxcdf\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.901711 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-bound-sa-token\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.901785 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxcdf\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-kube-api-access-gxcdf\") pod \"cert-manager-759f64656b-nt75l\" (UID: 
\"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.923350 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxcdf\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-kube-api-access-gxcdf\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.923614 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6d082fa-fedb-4089-87be-6bd1f0922f14-bound-sa-token\") pod \"cert-manager-759f64656b-nt75l\" (UID: \"e6d082fa-fedb-4089-87be-6bd1f0922f14\") " pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.996701 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_64d31296-d621-4ea6-9145-d61b41e2f2f8/docker-build/0.log" Mar 22 00:21:36 crc kubenswrapper[5116]: I0322 00:21:36.997217 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.025909 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-nt75l" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.105952 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.106374 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.106533 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107366 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107407 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107479 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107596 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107651 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107696 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107791 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107870 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107925 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107966 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") pod \"64d31296-d621-4ea6-9145-d61b41e2f2f8\" (UID: \"64d31296-d621-4ea6-9145-d61b41e2f2f8\") " Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.107970 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108309 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108361 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108456 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108496 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108792 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108814 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108830 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108842 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108854 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108867 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108879 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/64d31296-d621-4ea6-9145-d61b41e2f2f8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.108891 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.109180 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.149971 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz" (OuterVolumeSpecName: "kube-api-access-hcqmz") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "kube-api-access-hcqmz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.150147 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.155362 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "64d31296-d621-4ea6-9145-d61b41e2f2f8" (UID: "64d31296-d621-4ea6-9145-d61b41e2f2f8"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211077 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211137 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/64d31296-d621-4ea6-9145-d61b41e2f2f8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211157 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcqmz\" (UniqueName: \"kubernetes.io/projected/64d31296-d621-4ea6-9145-d61b41e2f2f8-kube-api-access-hcqmz\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211189 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.211201 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/64d31296-d621-4ea6-9145-d61b41e2f2f8-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.483094 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-nt75l"] Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.643004 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_64d31296-d621-4ea6-9145-d61b41e2f2f8/docker-build/0.log" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.644239 5116 generic.go:358] "Generic (PLEG): container 
finished" podID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerID="0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273" exitCode=1 Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.644785 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerDied","Data":"0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273"} Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.644845 5116 scope.go:117] "RemoveContainer" containerID="0eeebb3d3c244e878a2e5436278966e18e43bd8b1746ed0cac55d3a481273273" Mar 22 00:21:37 crc kubenswrapper[5116]: I0322 00:21:37.663317 5116 scope.go:117] "RemoveContainer" containerID="f3b30650b7f17897ba5577d414b6e7254548d88f736d4402f1743debfd49c680" Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.653710 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.653704 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"64d31296-d621-4ea6-9145-d61b41e2f2f8","Type":"ContainerDied","Data":"1ed3d96a6af3d62515a9799736475729996fdf1fec128d2904bdadd2196b09d8"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.655705 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nt75l" event={"ID":"e6d082fa-fedb-4089-87be-6bd1f0922f14","Type":"ContainerStarted","Data":"a2af18dfca18a5f1b09b814bf89f0366b28a504f7958cd3c8152c3bbb1548baf"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.655751 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-nt75l" event={"ID":"e6d082fa-fedb-4089-87be-6bd1f0922f14","Type":"ContainerStarted","Data":"260824329a312836d7a1180ec769c75537495b4fe0141a96f98865436688e964"} 
Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.657360 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerStarted","Data":"fdd7681f7def8181a3d18d083702a111b549b2b54f9a660df6b4b6ed1754f9b6"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.657430 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerStarted","Data":"4bff84bbfff058bf2ecd5285af8c9b30ed4caa49bd5b3b83f0e15af519300215"} Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.682844 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-nt75l" podStartSLOduration=2.682817826 podStartE2EDuration="2.682817826s" podCreationTimestamp="2026-03-22 00:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:21:38.680025356 +0000 UTC m=+769.702326769" watchObservedRunningTime="2026-03-22 00:21:38.682817826 +0000 UTC m=+769.705119209" Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.706456 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:38 crc kubenswrapper[5116]: I0322 00:21:38.717899 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 22 00:21:39 crc kubenswrapper[5116]: I0322 00:21:39.707201 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" path="/var/lib/kubelet/pods/64d31296-d621-4ea6-9145-d61b41e2f2f8/volumes" Mar 22 00:21:40 crc kubenswrapper[5116]: I0322 00:21:40.555198 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="service-telemetry/elasticsearch-es-default-0" Mar 22 00:21:47 crc kubenswrapper[5116]: I0322 00:21:47.740022 5116 generic.go:358] "Generic (PLEG): container finished" podID="b5b6453f-08b2-400a-8db6-b042778914e1" containerID="fdd7681f7def8181a3d18d083702a111b549b2b54f9a660df6b4b6ed1754f9b6" exitCode=0 Mar 22 00:21:47 crc kubenswrapper[5116]: I0322 00:21:47.740202 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"fdd7681f7def8181a3d18d083702a111b549b2b54f9a660df6b4b6ed1754f9b6"} Mar 22 00:21:48 crc kubenswrapper[5116]: I0322 00:21:48.753406 5116 generic.go:358] "Generic (PLEG): container finished" podID="b5b6453f-08b2-400a-8db6-b042778914e1" containerID="aebfeeceb4efdd1c94ed085dd3b8ce981a1f74150b4b0a287a7cc662eb1c1989" exitCode=0 Mar 22 00:21:48 crc kubenswrapper[5116]: I0322 00:21:48.753583 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"aebfeeceb4efdd1c94ed085dd3b8ce981a1f74150b4b0a287a7cc662eb1c1989"} Mar 22 00:21:48 crc kubenswrapper[5116]: I0322 00:21:48.796299 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_b5b6453f-08b2-400a-8db6-b042778914e1/manage-dockerfile/0.log" Mar 22 00:21:49 crc kubenswrapper[5116]: I0322 00:21:49.769593 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerStarted","Data":"9f80dd160f49863da97b4679eccf90a67796405074ac221d7cb0f7d3175f7f14"} Mar 22 00:21:49 crc kubenswrapper[5116]: I0322 00:21:49.802026 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" 
podStartSLOduration=19.802006586 podStartE2EDuration="19.802006586s" podCreationTimestamp="2026-03-22 00:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:21:49.79529372 +0000 UTC m=+780.817595103" watchObservedRunningTime="2026-03-22 00:21:49.802006586 +0000 UTC m=+780.824307979" Mar 22 00:21:53 crc kubenswrapper[5116]: I0322 00:21:53.056948 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:21:53 crc kubenswrapper[5116]: I0322 00:21:53.057088 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.149343 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150614 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150629 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150661 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="manage-dockerfile" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150671 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="manage-dockerfile" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.150800 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="64d31296-d621-4ea6-9145-d61b41e2f2f8" containerName="docker-build" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.323495 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.323565 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.326076 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.326637 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.327324 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.471456 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod \"auto-csr-approver-29568982-7k9wz\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.572523 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod 
\"auto-csr-approver-29568982-7k9wz\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.594667 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod \"auto-csr-approver-29568982-7k9wz\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.642682 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:00 crc kubenswrapper[5116]: W0322 00:22:00.905659 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26bdb492_c2c3_48e7_b86a_b83cb2f4aea5.slice/crio-a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6 WatchSource:0}: Error finding container a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6: Status 404 returned error can't find the container with id a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6 Mar 22 00:22:00 crc kubenswrapper[5116]: I0322 00:22:00.905820 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:22:01 crc kubenswrapper[5116]: I0322 00:22:01.878766 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" event={"ID":"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5","Type":"ContainerStarted","Data":"a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6"} Mar 22 00:22:02 crc kubenswrapper[5116]: I0322 00:22:02.887811 5116 generic.go:358] "Generic (PLEG): container finished" podID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" 
containerID="2ddc5261a5a71c38bc0367d296e6678e1f3648ba5a05ce953a8407e9e3ce8a74" exitCode=0 Mar 22 00:22:02 crc kubenswrapper[5116]: I0322 00:22:02.887885 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" event={"ID":"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5","Type":"ContainerDied","Data":"2ddc5261a5a71c38bc0367d296e6678e1f3648ba5a05ce953a8407e9e3ce8a74"} Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.160931 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.228392 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") pod \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\" (UID: \"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5\") " Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.235908 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq" (OuterVolumeSpecName: "kube-api-access-zwpfq") pod "26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" (UID: "26bdb492-c2c3-48e7-b86a-b83cb2f4aea5"). InnerVolumeSpecName "kube-api-access-zwpfq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.329642 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwpfq\" (UniqueName: \"kubernetes.io/projected/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5-kube-api-access-zwpfq\") on node \"crc\" DevicePath \"\"" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.904334 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" event={"ID":"26bdb492-c2c3-48e7-b86a-b83cb2f4aea5","Type":"ContainerDied","Data":"a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6"} Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.904386 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b54c8c3de25a3ab435a904b2a7eb72157c27770de39ea6ea2638fac5302db6" Mar 22 00:22:04 crc kubenswrapper[5116]: I0322 00:22:04.905073 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568982-7k9wz" Mar 22 00:22:05 crc kubenswrapper[5116]: I0322 00:22:05.239994 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"] Mar 22 00:22:05 crc kubenswrapper[5116]: I0322 00:22:05.245963 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568976-b4g9d"] Mar 22 00:22:05 crc kubenswrapper[5116]: I0322 00:22:05.706479 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832c911e-4692-4912-8df4-880e98e4c2c1" path="/var/lib/kubelet/pods/832c911e-4692-4912-8df4-880e98e4c2c1/volumes" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.057525 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.058235 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.058325 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.059391 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:22:23 crc kubenswrapper[5116]: I0322 00:22:23.059516 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2" gracePeriod=600 Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.194423 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2" exitCode=0 Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.194515 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" 
event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2"} Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.195458 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029"} Mar 22 00:22:24 crc kubenswrapper[5116]: I0322 00:22:24.195486 5116 scope.go:117] "RemoveContainer" containerID="29fe2544ad2992d4d45322cba6cd2af82b6ecb0aef833ef494e780ff8287385e" Mar 22 00:22:50 crc kubenswrapper[5116]: I0322 00:22:50.420772 5116 scope.go:117] "RemoveContainer" containerID="9cfe6ad0080f9bd011bb482561dcac74a7fc0e16adff6a8d4fce7c2e783aaf6b" Mar 22 00:23:14 crc kubenswrapper[5116]: I0322 00:23:14.565970 5116 generic.go:358] "Generic (PLEG): container finished" podID="b5b6453f-08b2-400a-8db6-b042778914e1" containerID="9f80dd160f49863da97b4679eccf90a67796405074ac221d7cb0f7d3175f7f14" exitCode=0 Mar 22 00:23:14 crc kubenswrapper[5116]: I0322 00:23:14.566034 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"9f80dd160f49863da97b4679eccf90a67796405074ac221d7cb0f7d3175f7f14"} Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.833502 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.962902 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.962969 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.962991 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963011 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963028 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 
00:23:15.963069 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963101 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963144 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963159 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963218 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963215 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") 
pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963313 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963331 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") pod \"b5b6453f-08b2-400a-8db6-b042778914e1\" (UID: \"b5b6453f-08b2-400a-8db6-b042778914e1\") " Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963529 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.963994 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.964496 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). 
InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.964893 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.964939 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.965419 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.968988 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g" (OuterVolumeSpecName: "kube-api-access-c724g") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "kube-api-access-c724g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.969343 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.971205 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:23:15 crc kubenswrapper[5116]: I0322 00:23:15.997683 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064534 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064581 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064600 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064616 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/b5b6453f-08b2-400a-8db6-b042778914e1-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064627 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c724g\" (UniqueName: \"kubernetes.io/projected/b5b6453f-08b2-400a-8db6-b042778914e1-kube-api-access-c724g\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064640 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064652 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064664 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b5b6453f-08b2-400a-8db6-b042778914e1-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.064676 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b5b6453f-08b2-400a-8db6-b042778914e1-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.124081 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.165738 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.584555 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"b5b6453f-08b2-400a-8db6-b042778914e1","Type":"ContainerDied","Data":"4bff84bbfff058bf2ecd5285af8c9b30ed4caa49bd5b3b83f0e15af519300215"} Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.584978 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bff84bbfff058bf2ecd5285af8c9b30ed4caa49bd5b3b83f0e15af519300215" Mar 22 00:23:16 crc kubenswrapper[5116]: I0322 00:23:16.584586 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 22 00:23:17 crc kubenswrapper[5116]: I0322 00:23:17.968635 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b5b6453f-08b2-400a-8db6-b042778914e1" (UID: "b5b6453f-08b2-400a-8db6-b042778914e1"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:17 crc kubenswrapper[5116]: I0322 00:23:17.995585 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b5b6453f-08b2-400a-8db6-b042778914e1-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.565610 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566700 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="git-clone" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566716 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="git-clone" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566734 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="docker-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566739 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="docker-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566752 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" containerName="oc" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566758 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" containerName="oc" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566770 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="manage-dockerfile" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566775 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="manage-dockerfile" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566882 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b6453f-08b2-400a-8db6-b042778914e1" containerName="docker-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.566893 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" containerName="oc" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.841275 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.841609 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.844046 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-1-global-ca\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.844305 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-1-sys-config\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.844615 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.845815 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-1-ca\"" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939802 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod 
\"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939866 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939896 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939914 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.939978 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940028 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940130 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940269 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940308 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940380 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940433 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:20 crc kubenswrapper[5116]: I0322 00:23:20.940470 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041404 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041468 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041509 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod 
\"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041531 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041553 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041586 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041614 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041626 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041648 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.041881 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042263 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042428 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" 
(UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042747 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042619 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042801 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.042784 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.043484 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.043519 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.043542 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.049713 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.049749 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.061970 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"smart-gateway-operator-1-build\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.165310 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.585077 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:21 crc kubenswrapper[5116]: I0322 00:23:21.623832 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerStarted","Data":"217c472cfa6d0d26fe99bb6287f96ab85531e90f295a41a345b695b66937057a"} Mar 22 00:23:22 crc kubenswrapper[5116]: I0322 00:23:22.632388 5116 generic.go:358] "Generic (PLEG): container finished" podID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerID="4fc1945949f4a48bc881b27cfbe1bdcda0b53a37ec4eb1a6d836aed83950ab03" exitCode=0 Mar 22 00:23:22 crc kubenswrapper[5116]: I0322 00:23:22.632487 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" 
event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerDied","Data":"4fc1945949f4a48bc881b27cfbe1bdcda0b53a37ec4eb1a6d836aed83950ab03"} Mar 22 00:23:23 crc kubenswrapper[5116]: I0322 00:23:23.642728 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerStarted","Data":"6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782"} Mar 22 00:23:23 crc kubenswrapper[5116]: I0322 00:23:23.674678 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.674658504 podStartE2EDuration="3.674658504s" podCreationTimestamp="2026-03-22 00:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:23:23.671671907 +0000 UTC m=+874.693973310" watchObservedRunningTime="2026-03-22 00:23:23.674658504 +0000 UTC m=+874.696959887" Mar 22 00:23:31 crc kubenswrapper[5116]: I0322 00:23:31.192045 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 22 00:23:31 crc kubenswrapper[5116]: I0322 00:23:31.193292 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build" containerID="cri-o://6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782" gracePeriod=30 Mar 22 00:23:32 crc kubenswrapper[5116]: I0322 00:23:32.823237 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.268543 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 
00:23:33.268745 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.270950 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-2-sys-config\"" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.271007 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-2-global-ca\"" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.271218 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-2-ca\"" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311113 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311318 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311347 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311469 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311542 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311639 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311672 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311710 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311744 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311814 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311876 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.311961 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: 
I0322 00:23:33.415763 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415819 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415844 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415892 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415912 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415931 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415955 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415973 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.415995 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416017 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod 
\"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416044 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416304 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416343 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416386 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416597 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: 
\"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416680 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416682 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416611 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416823 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.416888 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.417865 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.421518 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.421543 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.434105 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod \"smart-gateway-operator-2-build\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.587017 5116 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.727387 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_5e320997-981e-4e89-992a-8ad6f1f6099a/docker-build/0.log" Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.729228 5116 generic.go:358] "Generic (PLEG): container finished" podID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerID="6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782" exitCode=1 Mar 22 00:23:33 crc kubenswrapper[5116]: I0322 00:23:33.729783 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerDied","Data":"6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782"} Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.031231 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.253101 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_5e320997-981e-4e89-992a-8ad6f1f6099a/docker-build/0.log" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.253548 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.332419 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.332885 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.332968 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333008 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333053 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333070 5116 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333105 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333140 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333724 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333793 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333825 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333936 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333974 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") pod \"5e320997-981e-4e89-992a-8ad6f1f6099a\" (UID: \"5e320997-981e-4e89-992a-8ad6f1f6099a\") " Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334467 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334483 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5e320997-981e-4e89-992a-8ad6f1f6099a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.333969 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334584 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.334856 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.335188 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.335431 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.335862 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.339033 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.339066 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.339316 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884" (OuterVolumeSpecName: "kube-api-access-vm884") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "kube-api-access-vm884". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436066 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436109 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436124 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436136 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5e320997-981e-4e89-992a-8ad6f1f6099a-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436151 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436179 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vm884\" (UniqueName: \"kubernetes.io/projected/5e320997-981e-4e89-992a-8ad6f1f6099a-kube-api-access-vm884\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436194 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436205 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.436216 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5e320997-981e-4e89-992a-8ad6f1f6099a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.478669 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5e320997-981e-4e89-992a-8ad6f1f6099a" (UID: "5e320997-981e-4e89-992a-8ad6f1f6099a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.537065 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5e320997-981e-4e89-992a-8ad6f1f6099a-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.749922 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerStarted","Data":"d347e1d1d0bfa441475d985bf9ba23296fad7e939d0b5cd81d6c7a37f1fc4343"}
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.749994 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerStarted","Data":"ee2acea936ad827175b72459e0f3f311a25df509cce538719e55f747534cdcb7"}
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.755115 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_5e320997-981e-4e89-992a-8ad6f1f6099a/docker-build/0.log"
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.755948 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"5e320997-981e-4e89-992a-8ad6f1f6099a","Type":"ContainerDied","Data":"217c472cfa6d0d26fe99bb6287f96ab85531e90f295a41a345b695b66937057a"}
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.755959 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.756046 5116 scope.go:117] "RemoveContainer" containerID="6bf6460e54958e2605944c6c4c0ee5bdf476b98e4d7da44d036a7cf3c7d4a782"
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.839338 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.848053 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 22 00:23:34 crc kubenswrapper[5116]: I0322 00:23:34.853459 5116 scope.go:117] "RemoveContainer" containerID="4fc1945949f4a48bc881b27cfbe1bdcda0b53a37ec4eb1a6d836aed83950ab03"
Mar 22 00:23:35 crc kubenswrapper[5116]: I0322 00:23:35.704314 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" path="/var/lib/kubelet/pods/5e320997-981e-4e89-992a-8ad6f1f6099a/volumes"
Mar 22 00:23:35 crc kubenswrapper[5116]: I0322 00:23:35.763035 5116 generic.go:358] "Generic (PLEG): container finished" podID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerID="d347e1d1d0bfa441475d985bf9ba23296fad7e939d0b5cd81d6c7a37f1fc4343" exitCode=0
Mar 22 00:23:35 crc kubenswrapper[5116]: I0322 00:23:35.763133 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"d347e1d1d0bfa441475d985bf9ba23296fad7e939d0b5cd81d6c7a37f1fc4343"}
Mar 22 00:23:36 crc kubenswrapper[5116]: I0322 00:23:36.774219 5116 generic.go:358] "Generic (PLEG): container finished" podID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerID="2f35ca229ed42938e875630eca11bd2baec770d2b9a943980d5198c7d2dd43fb" exitCode=0
Mar 22 00:23:36 crc kubenswrapper[5116]: I0322 00:23:36.774340 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"2f35ca229ed42938e875630eca11bd2baec770d2b9a943980d5198c7d2dd43fb"}
Mar 22 00:23:36 crc kubenswrapper[5116]: I0322 00:23:36.808498 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_8a3c0606-0188-415f-9d3f-b6477cda110e/manage-dockerfile/0.log"
Mar 22 00:23:37 crc kubenswrapper[5116]: I0322 00:23:37.785694 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerStarted","Data":"3a936d4064aa62dbd3538b2212d49894af70791a2665eec39925b6255ecb5c6b"}
Mar 22 00:23:37 crc kubenswrapper[5116]: I0322 00:23:37.819006 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=5.818983162 podStartE2EDuration="5.818983162s" podCreationTimestamp="2026-03-22 00:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:23:37.817090879 +0000 UTC m=+888.839392252" watchObservedRunningTime="2026-03-22 00:23:37.818983162 +0000 UTC m=+888.841284565"
Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.710198 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmsdh"]
Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711658 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build"
Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711677 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build"
Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711695 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="manage-dockerfile"
Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711704 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="manage-dockerfile"
Mar 22 00:23:43 crc kubenswrapper[5116]: I0322 00:23:43.711861 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e320997-981e-4e89-992a-8ad6f1f6099a" containerName="docker-build"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.560000 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"]
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.560240 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.692332 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.692415 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.692517 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.793953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.794141 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.794772 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.794923 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.795398 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.817460 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"community-operators-wmsdh\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") " pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:44 crc kubenswrapper[5116]: I0322 00:23:44.892552 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.129148 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"]
Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.835157 5116 generic.go:358] "Generic (PLEG): container finished" podID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23" exitCode=0
Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.835374 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"}
Mar 22 00:23:45 crc kubenswrapper[5116]: I0322 00:23:45.835400 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerStarted","Data":"1e631c427ba65a808a4fa4d0e57e810ac41d1ca777f6d8d7f029e3a9a37fd05a"}
Mar 22 00:23:47 crc kubenswrapper[5116]: I0322 00:23:47.853259 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerStarted","Data":"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"}
Mar 22 00:23:48 crc kubenswrapper[5116]: I0322 00:23:48.861628 5116 generic.go:358] "Generic (PLEG): container finished" podID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a" exitCode=0
Mar 22 00:23:48 crc kubenswrapper[5116]: I0322 00:23:48.861740 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"}
Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.868514 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerStarted","Data":"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"}
Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.891480 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmsdh" podStartSLOduration=6.197640949 podStartE2EDuration="6.891464091s" podCreationTimestamp="2026-03-22 00:23:43 +0000 UTC" firstStartedPulling="2026-03-22 00:23:46.844407941 +0000 UTC m=+897.866709314" lastFinishedPulling="2026-03-22 00:23:47.538231083 +0000 UTC m=+898.560532456" observedRunningTime="2026-03-22 00:23:49.88745187 +0000 UTC m=+900.909753243" watchObservedRunningTime="2026-03-22 00:23:49.891464091 +0000 UTC m=+900.913765464"
Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.980621 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.982948 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.988806 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:23:49 crc kubenswrapper[5116]: I0322 00:23:49.989785 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.893283 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.893783 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.945189 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:54 crc kubenswrapper[5116]: I0322 00:23:54.978612 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:55 crc kubenswrapper[5116]: I0322 00:23:55.177017 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"]
Mar 22 00:23:56 crc kubenswrapper[5116]: I0322 00:23:56.924906 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmsdh" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server" containerID="cri-o://ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" gracePeriod=2
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.324005 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.357949 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") pod \"7644c42a-78ae-476f-84f8-2f1d9372f921\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") "
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.358060 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") pod \"7644c42a-78ae-476f-84f8-2f1d9372f921\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") "
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.358243 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") pod \"7644c42a-78ae-476f-84f8-2f1d9372f921\" (UID: \"7644c42a-78ae-476f-84f8-2f1d9372f921\") "
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.359156 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities" (OuterVolumeSpecName: "utilities") pod "7644c42a-78ae-476f-84f8-2f1d9372f921" (UID: "7644c42a-78ae-476f-84f8-2f1d9372f921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.364670 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8" (OuterVolumeSpecName: "kube-api-access-7lbm8") pod "7644c42a-78ae-476f-84f8-2f1d9372f921" (UID: "7644c42a-78ae-476f-84f8-2f1d9372f921"). InnerVolumeSpecName "kube-api-access-7lbm8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.413567 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7644c42a-78ae-476f-84f8-2f1d9372f921" (UID: "7644c42a-78ae-476f-84f8-2f1d9372f921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.460107 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lbm8\" (UniqueName: \"kubernetes.io/projected/7644c42a-78ae-476f-84f8-2f1d9372f921-kube-api-access-7lbm8\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.460147 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.460159 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7644c42a-78ae-476f-84f8-2f1d9372f921-utilities\") on node \"crc\" DevicePath \"\""
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.786278 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-42pwh"]
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787675 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787697 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787716 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-utilities"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787723 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-utilities"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787736 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-content"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787745 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="extract-content"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.787941 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerName="registry-server"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.804694 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"]
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.804839 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.865696 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.865804 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.865868 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951405 5116 generic.go:358] "Generic (PLEG): container finished" podID="7644c42a-78ae-476f-84f8-2f1d9372f921" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6" exitCode=0
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951516 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmsdh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951776 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"}
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951865 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmsdh" event={"ID":"7644c42a-78ae-476f-84f8-2f1d9372f921","Type":"ContainerDied","Data":"1e631c427ba65a808a4fa4d0e57e810ac41d1ca777f6d8d7f029e3a9a37fd05a"}
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.951889 5116 scope.go:117] "RemoveContainer" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.967578 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.967691 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.967713 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.968262 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.971705 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.978199 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"]
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.980252 5116 scope.go:117] "RemoveContainer" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.984042 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmsdh"]
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.990084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"certified-operators-42pwh\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:57 crc kubenswrapper[5116]: I0322 00:23:57.998019 5116 scope.go:117] "RemoveContainer" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.023986 5116 scope.go:117] "RemoveContainer" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"
Mar 22 00:23:58 crc kubenswrapper[5116]: E0322 00:23:58.024563 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6\": container with ID starting with ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6 not found: ID does not exist" containerID="ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.024605 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6"} err="failed to get container status \"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6\": rpc error: code = NotFound desc = could not find container \"ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6\": container with ID starting with ebe38c553f20e90aa2b7248def912252567dc730fa564a49bdbd668cec1574f6 not found: ID does not exist"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.024631 5116 scope.go:117] "RemoveContainer" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"
Mar 22 00:23:58 crc kubenswrapper[5116]: E0322 00:23:58.024957 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a\": container with ID starting with 0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a not found: ID does not exist" containerID="0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.024988 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a"} err="failed to get container status \"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a\": rpc error: code = NotFound desc = could not find container \"0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a\": container with ID starting with 0de476c313957e9bfebcc63b096215d58335b13c38937717bbd2cebe12d00e2a not found: ID does not exist"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.025007 5116 scope.go:117] "RemoveContainer" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"
Mar 22 00:23:58 crc kubenswrapper[5116]: E0322 00:23:58.025267 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23\": container with ID starting with a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23 not found: ID does not exist" containerID="a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.025285 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23"} err="failed to get container status \"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23\": rpc error: code = NotFound desc = could not find container \"a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23\": container with ID starting with a9b82b5e1637db60c44ebfe672bf8988270a6be31331481f933ad65e5b492b23 not found: ID does not exist"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.145746 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh"
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.583576 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"]
Mar 22 00:23:58 crc kubenswrapper[5116]: W0322 00:23:58.597300 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49647030_2576_4187_ba7e_d7514161f53d.slice/crio-25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a WatchSource:0}: Error finding container 25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a: Status 404 returned error can't find the container with id 25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.599103 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.960875 5116 generic.go:358] "Generic (PLEG): container finished" podID="49647030-2576-4187-ba7e-d7514161f53d" containerID="54526cfea93342666afe65f2e74e59e37e1224b18ad48bbbbfef34ea25a85af6" exitCode=0
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.960980 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"54526cfea93342666afe65f2e74e59e37e1224b18ad48bbbbfef34ea25a85af6"}
Mar 22 00:23:58 crc kubenswrapper[5116]: I0322 00:23:58.961393 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerStarted","Data":"25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a"}
Mar 22 00:23:59 crc kubenswrapper[5116]: I0322 00:23:59.709899 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7644c42a-78ae-476f-84f8-2f1d9372f921" path="/var/lib/kubelet/pods/7644c42a-78ae-476f-84f8-2f1d9372f921/volumes"
Mar 22 00:23:59 crc kubenswrapper[5116]: I0322 00:23:59.972669 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerStarted","Data":"2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429"}
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.135159 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"]
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.141325 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2"
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.143227 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.143606 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.144008 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.152225 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"]
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.201608 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"auto-csr-approver-29568984-7vpl2\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " pod="openshift-infra/auto-csr-approver-29568984-7vpl2"
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.303134 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"auto-csr-approver-29568984-7vpl2\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " pod="openshift-infra/auto-csr-approver-29568984-7vpl2"
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.321614 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"auto-csr-approver-29568984-7vpl2\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " pod="openshift-infra/auto-csr-approver-29568984-7vpl2"
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.505841 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2"
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.698774 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"]
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.981082 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerStarted","Data":"2efa21c0a91293e007e8e38af7c5a1638c6460f90d946780cdea26aa0fa32a0d"}
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.982806 5116 generic.go:358] "Generic (PLEG): container finished" podID="49647030-2576-4187-ba7e-d7514161f53d" containerID="2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429" exitCode=0
Mar 22 00:24:00 crc kubenswrapper[5116]: I0322 00:24:00.982974 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh"
event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429"} Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.007718 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerStarted","Data":"f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c"} Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.011316 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerStarted","Data":"c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4"} Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.038645 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" podStartSLOduration=1.091766324 podStartE2EDuration="2.038628385s" podCreationTimestamp="2026-03-22 00:24:00 +0000 UTC" firstStartedPulling="2026-03-22 00:24:00.706244446 +0000 UTC m=+911.728545819" lastFinishedPulling="2026-03-22 00:24:01.653106497 +0000 UTC m=+912.675407880" observedRunningTime="2026-03-22 00:24:02.037994064 +0000 UTC m=+913.060295437" watchObservedRunningTime="2026-03-22 00:24:02.038628385 +0000 UTC m=+913.060929748" Mar 22 00:24:02 crc kubenswrapper[5116]: I0322 00:24:02.043072 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-42pwh" podStartSLOduration=4.214725348 podStartE2EDuration="5.043056461s" podCreationTimestamp="2026-03-22 00:23:57 +0000 UTC" firstStartedPulling="2026-03-22 00:23:58.96272189 +0000 UTC m=+909.985023303" lastFinishedPulling="2026-03-22 00:23:59.791053033 +0000 UTC m=+910.813354416" observedRunningTime="2026-03-22 00:24:02.025256253 +0000 UTC m=+913.047557636" 
watchObservedRunningTime="2026-03-22 00:24:02.043056461 +0000 UTC m=+913.065357834" Mar 22 00:24:03 crc kubenswrapper[5116]: I0322 00:24:03.019972 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerDied","Data":"c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4"} Mar 22 00:24:03 crc kubenswrapper[5116]: I0322 00:24:03.019874 5116 generic.go:358] "Generic (PLEG): container finished" podID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerID="c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4" exitCode=0 Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.316038 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.461356 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") pod \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\" (UID: \"dd880bf8-6058-4924-8268-c4cdcd44bdcf\") " Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.470543 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx" (OuterVolumeSpecName: "kube-api-access-mt5rx") pod "dd880bf8-6058-4924-8268-c4cdcd44bdcf" (UID: "dd880bf8-6058-4924-8268-c4cdcd44bdcf"). InnerVolumeSpecName "kube-api-access-mt5rx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:04 crc kubenswrapper[5116]: I0322 00:24:04.566460 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mt5rx\" (UniqueName: \"kubernetes.io/projected/dd880bf8-6058-4924-8268-c4cdcd44bdcf-kube-api-access-mt5rx\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.044206 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" event={"ID":"dd880bf8-6058-4924-8268-c4cdcd44bdcf","Type":"ContainerDied","Data":"2efa21c0a91293e007e8e38af7c5a1638c6460f90d946780cdea26aa0fa32a0d"} Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.044263 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568984-7vpl2" Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.044276 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2efa21c0a91293e007e8e38af7c5a1638c6460f90d946780cdea26aa0fa32a0d" Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.390195 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"] Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.396334 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568978-vkbll"] Mar 22 00:24:05 crc kubenswrapper[5116]: I0322 00:24:05.708808 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ada1f6-f713-45cb-8230-b9a2d89878ab" path="/var/lib/kubelet/pods/07ada1f6-f713-45cb-8230-b9a2d89878ab/volumes" Mar 22 00:24:08 crc kubenswrapper[5116]: I0322 00:24:08.146092 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:08 crc kubenswrapper[5116]: I0322 00:24:08.147050 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:08 crc kubenswrapper[5116]: I0322 00:24:08.197682 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:09 crc kubenswrapper[5116]: I0322 00:24:09.114823 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:09 crc kubenswrapper[5116]: I0322 00:24:09.166369 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:24:11 crc kubenswrapper[5116]: I0322 00:24:11.085811 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-42pwh" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" containerID="cri-o://f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c" gracePeriod=2 Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.096231 5116 generic.go:358] "Generic (PLEG): container finished" podID="49647030-2576-4187-ba7e-d7514161f53d" containerID="f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c" exitCode=0 Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.096441 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c"} Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.552129 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.596161 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") pod \"49647030-2576-4187-ba7e-d7514161f53d\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.596258 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") pod \"49647030-2576-4187-ba7e-d7514161f53d\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.596308 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") pod \"49647030-2576-4187-ba7e-d7514161f53d\" (UID: \"49647030-2576-4187-ba7e-d7514161f53d\") " Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.598192 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities" (OuterVolumeSpecName: "utilities") pod "49647030-2576-4187-ba7e-d7514161f53d" (UID: "49647030-2576-4187-ba7e-d7514161f53d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.606796 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh" (OuterVolumeSpecName: "kube-api-access-j4qqh") pod "49647030-2576-4187-ba7e-d7514161f53d" (UID: "49647030-2576-4187-ba7e-d7514161f53d"). InnerVolumeSpecName "kube-api-access-j4qqh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.637548 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49647030-2576-4187-ba7e-d7514161f53d" (UID: "49647030-2576-4187-ba7e-d7514161f53d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.697196 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.697261 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49647030-2576-4187-ba7e-d7514161f53d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:12 crc kubenswrapper[5116]: I0322 00:24:12.697284 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4qqh\" (UniqueName: \"kubernetes.io/projected/49647030-2576-4187-ba7e-d7514161f53d-kube-api-access-j4qqh\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.107786 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-42pwh" event={"ID":"49647030-2576-4187-ba7e-d7514161f53d","Type":"ContainerDied","Data":"25f386cffffaef0016eea677bfdd2ee1dc93c0140cb90fd72fcaf4a608c0464a"} Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.108591 5116 scope.go:117] "RemoveContainer" containerID="f8afc86015a284985c9b4eb7a38cee7fb97abb04ba3be621c6437a9078f6747c" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.107900 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-42pwh" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.130737 5116 scope.go:117] "RemoveContainer" containerID="2b2a78638fc6aafc7314ab3d3c208dcc214f4792249aca96fd548ab4f628b429" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.147984 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.149990 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-42pwh"] Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.174258 5116 scope.go:117] "RemoveContainer" containerID="54526cfea93342666afe65f2e74e59e37e1224b18ad48bbbbfef34ea25a85af6" Mar 22 00:24:13 crc kubenswrapper[5116]: I0322 00:24:13.712146 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49647030-2576-4187-ba7e-d7514161f53d" path="/var/lib/kubelet/pods/49647030-2576-4187-ba7e-d7514161f53d/volumes" Mar 22 00:24:23 crc kubenswrapper[5116]: I0322 00:24:23.057503 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:24:23 crc kubenswrapper[5116]: I0322 00:24:23.057941 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:24:41 crc kubenswrapper[5116]: I0322 00:24:41.293127 5116 generic.go:358] "Generic (PLEG): container finished" podID="8a3c0606-0188-415f-9d3f-b6477cda110e" 
containerID="3a936d4064aa62dbd3538b2212d49894af70791a2665eec39925b6255ecb5c6b" exitCode=0 Mar 22 00:24:41 crc kubenswrapper[5116]: I0322 00:24:41.293183 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"3a936d4064aa62dbd3538b2212d49894af70791a2665eec39925b6255ecb5c6b"} Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.569519 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638297 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638363 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638396 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638417 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638434 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638450 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638534 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638577 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638597 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: 
\"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638617 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.638675 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") pod \"8a3c0606-0188-415f-9d3f-b6477cda110e\" (UID: \"8a3c0606-0188-415f-9d3f-b6477cda110e\") " Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.639379 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.639436 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.640416 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.640490 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.640530 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.642147 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.644609 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.647304 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.647322 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.647357 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb" (OuterVolumeSpecName: "kube-api-access-n2lsb") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "kube-api-access-n2lsb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.739760 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.739977 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740040 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740099 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2lsb\" (UniqueName: \"kubernetes.io/projected/8a3c0606-0188-415f-9d3f-b6477cda110e-kube-api-access-n2lsb\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740156 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740249 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8a3c0606-0188-415f-9d3f-b6477cda110e-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740312 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8a3c0606-0188-415f-9d3f-b6477cda110e-build-ca-bundles\") on node \"crc\" DevicePath 
\"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740370 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740428 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/8a3c0606-0188-415f-9d3f-b6477cda110e-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.740484 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.815423 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:42 crc kubenswrapper[5116]: I0322 00:24:42.841628 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:43 crc kubenswrapper[5116]: I0322 00:24:43.312999 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8a3c0606-0188-415f-9d3f-b6477cda110e","Type":"ContainerDied","Data":"ee2acea936ad827175b72459e0f3f311a25df509cce538719e55f747534cdcb7"} Mar 22 00:24:43 crc kubenswrapper[5116]: I0322 00:24:43.313273 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee2acea936ad827175b72459e0f3f311a25df509cce538719e55f747534cdcb7" Mar 22 00:24:43 crc kubenswrapper[5116]: I0322 00:24:43.313275 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 22 00:24:44 crc kubenswrapper[5116]: I0322 00:24:44.592830 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8a3c0606-0188-415f-9d3f-b6477cda110e" (UID: "8a3c0606-0188-415f-9d3f-b6477cda110e"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:44 crc kubenswrapper[5116]: I0322 00:24:44.668254 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8a3c0606-0188-415f-9d3f-b6477cda110e-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.110474 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111640 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-utilities" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111905 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-utilities" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111924 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="manage-dockerfile" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111932 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="manage-dockerfile" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111959 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="docker-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111967 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="docker-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.111980 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 
00:24:47.111988 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112009 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="git-clone" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112016 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="git-clone" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112024 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerName="oc" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112030 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerName="oc" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112041 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-content" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112048 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="extract-content" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112186 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" containerName="oc" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112205 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="49647030-2576-4187-ba7e-d7514161f53d" containerName="registry-server" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.112215 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a3c0606-0188-415f-9d3f-b6477cda110e" containerName="docker-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.118832 
5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124422 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-1-global-ca\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124434 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-1-ca\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124427 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-1-sys-config\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.124976 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.127817 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201793 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201849 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201872 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.201907 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202036 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202083 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202134 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202180 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202212 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202255 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202278 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.202335 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304190 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304249 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304305 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304356 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304583 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304643 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.304876 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305013 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305066 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305081 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"sg-core-1-build\" (UID: 
\"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305099 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305143 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305250 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305553 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305631 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc 
kubenswrapper[5116]: I0322 00:24:47.305723 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.305812 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.306109 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.306149 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.306224 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.313527 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" 
(UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.314693 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.324149 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"sg-core-1-build\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.434354 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.632806 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:47 crc kubenswrapper[5116]: W0322 00:24:47.639387 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod485bf248_0704_4af5_b4b4_f349855d45a7.slice/crio-2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e WatchSource:0}: Error finding container 2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e: Status 404 returned error can't find the container with id 2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.826665 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.832782 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:47 crc kubenswrapper[5116]: I0322 00:24:47.836519 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.016150 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.016303 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.016367 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.117869 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.117932 5116 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.117972 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.118427 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.118501 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.137315 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"redhat-operators-97d7p\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.156786 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.362533 5116 generic.go:358] "Generic (PLEG): container finished" podID="485bf248-0704-4af5-b4b4-f349855d45a7" containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" exitCode=0 Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.362798 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerDied","Data":"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf"} Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.362845 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerStarted","Data":"2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e"} Mar 22 00:24:48 crc kubenswrapper[5116]: I0322 00:24:48.383245 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.373343 5116 generic.go:358] "Generic (PLEG): container finished" podID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907" exitCode=0 Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.373409 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"} Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.374070 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" 
event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerStarted","Data":"26cf5ed539e64f0184332ca616fb1f61e216f05f16c7ffbf5dede9ab4123f429"} Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.376516 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerStarted","Data":"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447"} Mar 22 00:24:49 crc kubenswrapper[5116]: I0322 00:24:49.427634 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=2.427618304 podStartE2EDuration="2.427618304s" podCreationTimestamp="2026-03-22 00:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:24:49.426426086 +0000 UTC m=+960.448727459" watchObservedRunningTime="2026-03-22 00:24:49.427618304 +0000 UTC m=+960.449919677" Mar 22 00:24:50 crc kubenswrapper[5116]: I0322 00:24:50.556270 5116 scope.go:117] "RemoveContainer" containerID="208d35041a700bcc47fefb636464fe18464c55b6addf5e55a5a1888e5fa3efb2" Mar 22 00:24:51 crc kubenswrapper[5116]: I0322 00:24:51.393243 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerStarted","Data":"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"} Mar 22 00:24:52 crc kubenswrapper[5116]: I0322 00:24:52.418246 5116 generic.go:358] "Generic (PLEG): container finished" podID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906" exitCode=0 Mar 22 00:24:52 crc kubenswrapper[5116]: I0322 00:24:52.418316 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" 
event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"} Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.056850 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.057253 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.426507 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerStarted","Data":"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"} Mar 22 00:24:53 crc kubenswrapper[5116]: I0322 00:24:53.449328 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-97d7p" podStartSLOduration=5.423232879 podStartE2EDuration="6.449306872s" podCreationTimestamp="2026-03-22 00:24:47 +0000 UTC" firstStartedPulling="2026-03-22 00:24:49.376500578 +0000 UTC m=+960.398801991" lastFinishedPulling="2026-03-22 00:24:50.402574571 +0000 UTC m=+961.424875984" observedRunningTime="2026-03-22 00:24:53.443220451 +0000 UTC m=+964.465521824" watchObservedRunningTime="2026-03-22 00:24:53.449306872 +0000 UTC m=+964.471608245" Mar 22 00:24:57 crc kubenswrapper[5116]: I0322 00:24:57.678142 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:57 crc kubenswrapper[5116]: I0322 00:24:57.679117 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" containerID="cri-o://9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" gracePeriod=30 Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.101453 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_485bf248-0704-4af5-b4b4-f349855d45a7/docker-build/0.log" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.102451 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.157197 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.157267 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158432 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158522 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158577 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158637 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158675 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158695 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158742 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158795 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") 
pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158833 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158887 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158938 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.158968 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") pod \"485bf248-0704-4af5-b4b4-f349855d45a7\" (UID: \"485bf248-0704-4af5-b4b4-f349855d45a7\") " Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159090 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159486 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159731 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159774 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159845 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.159859 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.160406 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.164812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt" (OuterVolumeSpecName: "kube-api-access-t2nqt") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "kube-api-access-t2nqt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.166354 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.167253 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.196328 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.258390 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260211 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260245 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260257 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260271 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260283 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260292 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260300 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2nqt\" (UniqueName: \"kubernetes.io/projected/485bf248-0704-4af5-b4b4-f349855d45a7-kube-api-access-t2nqt\") on node \"crc\" DevicePath 
\"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260308 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260316 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/485bf248-0704-4af5-b4b4-f349855d45a7-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260328 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/485bf248-0704-4af5-b4b4-f349855d45a7-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.260339 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/485bf248-0704-4af5-b4b4-f349855d45a7-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.390074 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "485bf248-0704-4af5-b4b4-f349855d45a7" (UID: "485bf248-0704-4af5-b4b4-f349855d45a7"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.461861 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/485bf248-0704-4af5-b4b4-f349855d45a7-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.463319 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_485bf248-0704-4af5-b4b4-f349855d45a7/docker-build/0.log" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.463876 5116 generic.go:358] "Generic (PLEG): container finished" podID="485bf248-0704-4af5-b4b4-f349855d45a7" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" exitCode=1 Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.464871 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.466320 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerDied","Data":"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447"} Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.466486 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"485bf248-0704-4af5-b4b4-f349855d45a7","Type":"ContainerDied","Data":"2190e9378e89191d390f2c65c4c525c23aef0fb302f3a2e896acf05cc78d5a0e"} Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.466563 5116 scope.go:117] "RemoveContainer" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.505466 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 
00:24:58.513539 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.516797 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.522662 5116 scope.go:117] "RemoveContainer" containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.558167 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.611315 5116 scope.go:117] "RemoveContainer" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" Mar 22 00:24:58 crc kubenswrapper[5116]: E0322 00:24:58.611842 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447\": container with ID starting with 9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447 not found: ID does not exist" containerID="9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.611895 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447"} err="failed to get container status \"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447\": rpc error: code = NotFound desc = could not find container \"9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447\": container with ID starting with 9e1bc2820c8eafe581010f0311e28903def8ce583e1045d21ad870f141429447 not found: ID does not exist" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.611931 5116 scope.go:117] "RemoveContainer" 
containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" Mar 22 00:24:58 crc kubenswrapper[5116]: E0322 00:24:58.612226 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf\": container with ID starting with 97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf not found: ID does not exist" containerID="97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf" Mar 22 00:24:58 crc kubenswrapper[5116]: I0322 00:24:58.612254 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf"} err="failed to get container status \"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf\": rpc error: code = NotFound desc = could not find container \"97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf\": container with ID starting with 97245c00e03ada14054ae81897b434bb5516c5f9efd3c8f7bd51ade6f20bbcbf not found: ID does not exist" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.292230 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293378 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293403 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293432 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="manage-dockerfile" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293444 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="manage-dockerfile" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.293644 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" containerName="docker-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.349319 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.349460 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.352377 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.352590 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-2-sys-config\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.353292 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-2-global-ca\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.353570 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-core-2-ca\"" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476696 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476755 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476810 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476832 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476867 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476894 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.476982 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477074 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477113 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477211 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477414 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.477454 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578398 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578465 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578503 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578528 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578592 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578883 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578893 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.578993 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579082 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579263 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " 
pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579335 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579442 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579480 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579750 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.579913 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580048 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580145 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580357 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580454 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.580750 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.581125 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.586811 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.591659 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.596546 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"sg-core-2-build\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.664588 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.708854 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485bf248-0704-4af5-b4b4-f349855d45a7" path="/var/lib/kubelet/pods/485bf248-0704-4af5-b4b4-f349855d45a7/volumes" Mar 22 00:24:59 crc kubenswrapper[5116]: I0322 00:24:59.893316 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 22 00:24:59 crc kubenswrapper[5116]: W0322 00:24:59.895854 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94167a49_edda_444e_bb01_c8b24c818557.slice/crio-53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459 WatchSource:0}: Error finding container 53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459: Status 404 returned error can't find the container with id 53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459 Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.482987 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerStarted","Data":"635157ab685b7378b40d6892c4266da1c8cb5d2f4d61edadd8e0cd2000e9691d"} Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.483418 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerStarted","Data":"53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459"} Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.483372 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-97d7p" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server" containerID="cri-o://84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" 
gracePeriod=2 Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.854806 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.900280 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") pod \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.900451 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") pod \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.900474 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") pod \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\" (UID: \"3fadb3c9-a4dd-44da-8b81-a8a3be611d61\") " Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.901394 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities" (OuterVolumeSpecName: "utilities") pod "3fadb3c9-a4dd-44da-8b81-a8a3be611d61" (UID: "3fadb3c9-a4dd-44da-8b81-a8a3be611d61"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:25:00 crc kubenswrapper[5116]: I0322 00:25:00.909362 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs" (OuterVolumeSpecName: "kube-api-access-6zlzs") pod "3fadb3c9-a4dd-44da-8b81-a8a3be611d61" (UID: "3fadb3c9-a4dd-44da-8b81-a8a3be611d61"). InnerVolumeSpecName "kube-api-access-6zlzs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.002013 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zlzs\" (UniqueName: \"kubernetes.io/projected/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-kube-api-access-6zlzs\") on node \"crc\" DevicePath \"\"" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.002048 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.028907 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fadb3c9-a4dd-44da-8b81-a8a3be611d61" (UID: "3fadb3c9-a4dd-44da-8b81-a8a3be611d61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.102998 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fadb3c9-a4dd-44da-8b81-a8a3be611d61-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491312 5116 generic.go:358] "Generic (PLEG): container finished" podID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" exitCode=0 Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491370 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"} Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491434 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-97d7p" event={"ID":"3fadb3c9-a4dd-44da-8b81-a8a3be611d61","Type":"ContainerDied","Data":"26cf5ed539e64f0184332ca616fb1f61e216f05f16c7ffbf5dede9ab4123f429"} Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.491457 5116 scope.go:117] "RemoveContainer" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.492357 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-97d7p" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.493700 5116 generic.go:358] "Generic (PLEG): container finished" podID="94167a49-edda-444e-bb01-c8b24c818557" containerID="635157ab685b7378b40d6892c4266da1c8cb5d2f4d61edadd8e0cd2000e9691d" exitCode=0 Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.493883 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"635157ab685b7378b40d6892c4266da1c8cb5d2f4d61edadd8e0cd2000e9691d"} Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.517371 5116 scope.go:117] "RemoveContainer" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.548229 5116 scope.go:117] "RemoveContainer" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.553441 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.558892 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-97d7p"] Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.603309 5116 scope.go:117] "RemoveContainer" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" Mar 22 00:25:01 crc kubenswrapper[5116]: E0322 00:25:01.603614 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c\": container with ID starting with 84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c not found: ID does not exist" containerID="84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c" Mar 22 00:25:01 
crc kubenswrapper[5116]: I0322 00:25:01.603645 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c"} err="failed to get container status \"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c\": rpc error: code = NotFound desc = could not find container \"84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c\": container with ID starting with 84c85179860da6ce37f4f4dc3aef0ac07cd5b4bce1e288b876edc160e15f701c not found: ID does not exist" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.603662 5116 scope.go:117] "RemoveContainer" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906" Mar 22 00:25:01 crc kubenswrapper[5116]: E0322 00:25:01.604434 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906\": container with ID starting with 6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906 not found: ID does not exist" containerID="6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.604455 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906"} err="failed to get container status \"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906\": rpc error: code = NotFound desc = could not find container \"6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906\": container with ID starting with 6641078c6ff5afeffcd6068753df9300c0519023b5a92376b99a9a5831718906 not found: ID does not exist" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.604467 5116 scope.go:117] "RemoveContainer" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907" Mar 22 
00:25:01 crc kubenswrapper[5116]: E0322 00:25:01.604691 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907\": container with ID starting with cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907 not found: ID does not exist" containerID="cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.604712 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907"} err="failed to get container status \"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907\": rpc error: code = NotFound desc = could not find container \"cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907\": container with ID starting with cf9c4e6e32cbba6521573cad17d8e29d39451985a3225355720a084120608907 not found: ID does not exist" Mar 22 00:25:01 crc kubenswrapper[5116]: I0322 00:25:01.704792 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" path="/var/lib/kubelet/pods/3fadb3c9-a4dd-44da-8b81-a8a3be611d61/volumes" Mar 22 00:25:02 crc kubenswrapper[5116]: I0322 00:25:02.504643 5116 generic.go:358] "Generic (PLEG): container finished" podID="94167a49-edda-444e-bb01-c8b24c818557" containerID="6c25378c5c67a582ced66fed35969fa2d19e2271e0060a1d9027bc652a4efbd5" exitCode=0 Mar 22 00:25:02 crc kubenswrapper[5116]: I0322 00:25:02.504754 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"6c25378c5c67a582ced66fed35969fa2d19e2271e0060a1d9027bc652a4efbd5"} Mar 22 00:25:02 crc kubenswrapper[5116]: I0322 00:25:02.536060 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_sg-core-2-build_94167a49-edda-444e-bb01-c8b24c818557/manage-dockerfile/0.log" Mar 22 00:25:03 crc kubenswrapper[5116]: I0322 00:25:03.513905 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerStarted","Data":"32d83cf2cd3027aeeae521f32493abec068b21a8573e03777a412660a6ffef93"} Mar 22 00:25:03 crc kubenswrapper[5116]: I0322 00:25:03.537495 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=4.537477629 podStartE2EDuration="4.537477629s" podCreationTimestamp="2026-03-22 00:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:25:03.532791865 +0000 UTC m=+974.555093248" watchObservedRunningTime="2026-03-22 00:25:03.537477629 +0000 UTC m=+974.559779002" Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.056989 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.057637 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.057699 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 
00:25:23.058454 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.058538 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029" gracePeriod=600 Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643080 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029" exitCode=0 Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643258 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029"} Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643466 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd"} Mar 22 00:25:23 crc kubenswrapper[5116]: I0322 00:25:23.643485 5116 scope.go:117] "RemoveContainer" containerID="2bace879b3cb84a0484101d5bad3c5693b65e8c5fa47a655bcdd4b8fee4ab4a2" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.143890 5116 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146015 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146044 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146095 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-content" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146107 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-content" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146132 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-utilities" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146144 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="extract-utilities" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.146371 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fadb3c9-a4dd-44da-8b81-a8a3be611d61" containerName="registry-server" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.449144 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.449320 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.455446 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.455872 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.467585 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.627016 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"auto-csr-approver-29568986-l778x\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.728249 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"auto-csr-approver-29568986-l778x\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.759872 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"auto-csr-approver-29568986-l778x\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:00 crc kubenswrapper[5116]: I0322 00:26:00.784019 5116 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:01 crc kubenswrapper[5116]: I0322 00:26:01.023408 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:26:01 crc kubenswrapper[5116]: I0322 00:26:01.958417 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerStarted","Data":"aa4ca9133814e5f48cc6f2b6eaadca8bbcb7b070e4190ce04658801f3f251cc0"} Mar 22 00:26:02 crc kubenswrapper[5116]: I0322 00:26:02.965995 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerStarted","Data":"795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef"} Mar 22 00:26:03 crc kubenswrapper[5116]: I0322 00:26:03.983008 5116 generic.go:358] "Generic (PLEG): container finished" podID="d42dbb69-e840-4b6a-b719-52396f82919e" containerID="795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef" exitCode=0 Mar 22 00:26:03 crc kubenswrapper[5116]: I0322 00:26:03.983158 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerDied","Data":"795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef"} Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.262302 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.390002 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") pod \"d42dbb69-e840-4b6a-b719-52396f82919e\" (UID: \"d42dbb69-e840-4b6a-b719-52396f82919e\") " Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.398096 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj" (OuterVolumeSpecName: "kube-api-access-q2smj") pod "d42dbb69-e840-4b6a-b719-52396f82919e" (UID: "d42dbb69-e840-4b6a-b719-52396f82919e"). InnerVolumeSpecName "kube-api-access-q2smj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.491620 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2smj\" (UniqueName: \"kubernetes.io/projected/d42dbb69-e840-4b6a-b719-52396f82919e-kube-api-access-q2smj\") on node \"crc\" DevicePath \"\"" Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.999593 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568986-l778x" Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.999665 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568986-l778x" event={"ID":"d42dbb69-e840-4b6a-b719-52396f82919e","Type":"ContainerDied","Data":"aa4ca9133814e5f48cc6f2b6eaadca8bbcb7b070e4190ce04658801f3f251cc0"} Mar 22 00:26:05 crc kubenswrapper[5116]: I0322 00:26:05.999701 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa4ca9133814e5f48cc6f2b6eaadca8bbcb7b070e4190ce04658801f3f251cc0" Mar 22 00:26:06 crc kubenswrapper[5116]: I0322 00:26:06.324070 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"] Mar 22 00:26:06 crc kubenswrapper[5116]: I0322 00:26:06.332436 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568980-ksbk2"] Mar 22 00:26:07 crc kubenswrapper[5116]: I0322 00:26:07.708562 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11840734-dc87-4532-b341-aeb889f011c4" path="/var/lib/kubelet/pods/11840734-dc87-4532-b341-aeb889f011c4/volumes" Mar 22 00:26:50 crc kubenswrapper[5116]: I0322 00:26:50.703634 5116 scope.go:117] "RemoveContainer" containerID="8af1cbb249920bcf15fdbe5b5c0f46798a3269524c401605be7ddebc1e8b62d8" Mar 22 00:27:23 crc kubenswrapper[5116]: I0322 00:27:23.057046 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:27:23 crc kubenswrapper[5116]: I0322 00:27:23.057679 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:27:53 crc kubenswrapper[5116]: I0322 00:27:53.057214 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:27:53 crc kubenswrapper[5116]: I0322 00:27:53.057766 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.135597 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.136993 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" containerName="oc" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.137013 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" containerName="oc" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.137157 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" containerName="oc" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.159158 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.159305 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.162103 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.162108 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.163000 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.241132 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"auto-csr-approver-29568988-gbkp8\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.342381 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"auto-csr-approver-29568988-gbkp8\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.363417 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"auto-csr-approver-29568988-gbkp8\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.476762 5116 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.684948 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:28:00 crc kubenswrapper[5116]: I0322 00:28:00.746586 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" event={"ID":"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39","Type":"ContainerStarted","Data":"730f3b13214d7bbd73298973658c8d4e1cd3372e57cc1ed17de997720a904f1a"} Mar 22 00:28:06 crc kubenswrapper[5116]: I0322 00:28:06.788696 5116 generic.go:358] "Generic (PLEG): container finished" podID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerID="6aa0210430a18d2b96a0bdd3c189fd69455e4afb5753bc588ac349572da2555d" exitCode=0 Mar 22 00:28:06 crc kubenswrapper[5116]: I0322 00:28:06.788806 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" event={"ID":"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39","Type":"ContainerDied","Data":"6aa0210430a18d2b96a0bdd3c189fd69455e4afb5753bc588ac349572da2555d"} Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.006872 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.154126 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") pod \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\" (UID: \"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39\") " Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.164855 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8" (OuterVolumeSpecName: "kube-api-access-2h9v8") pod "135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" (UID: "135f0dfe-78e2-4264-ae7b-7d6b95ebbb39"). InnerVolumeSpecName "kube-api-access-2h9v8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.256305 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2h9v8\" (UniqueName: \"kubernetes.io/projected/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39-kube-api-access-2h9v8\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.802617 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.802647 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568988-gbkp8" event={"ID":"135f0dfe-78e2-4264-ae7b-7d6b95ebbb39","Type":"ContainerDied","Data":"730f3b13214d7bbd73298973658c8d4e1cd3372e57cc1ed17de997720a904f1a"} Mar 22 00:28:08 crc kubenswrapper[5116]: I0322 00:28:08.802688 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730f3b13214d7bbd73298973658c8d4e1cd3372e57cc1ed17de997720a904f1a" Mar 22 00:28:09 crc kubenswrapper[5116]: I0322 00:28:09.072484 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:28:09 crc kubenswrapper[5116]: I0322 00:28:09.076256 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568982-7k9wz"] Mar 22 00:28:09 crc kubenswrapper[5116]: I0322 00:28:09.712271 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26bdb492-c2c3-48e7-b86a-b83cb2f4aea5" path="/var/lib/kubelet/pods/26bdb492-c2c3-48e7-b86a-b83cb2f4aea5/volumes" Mar 22 00:28:15 crc kubenswrapper[5116]: I0322 00:28:15.863700 5116 generic.go:358] "Generic (PLEG): container finished" podID="94167a49-edda-444e-bb01-c8b24c818557" containerID="32d83cf2cd3027aeeae521f32493abec068b21a8573e03777a412660a6ffef93" exitCode=0 Mar 22 00:28:15 crc kubenswrapper[5116]: I0322 00:28:15.863798 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"32d83cf2cd3027aeeae521f32493abec068b21a8573e03777a412660a6ffef93"} Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.221815 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.289897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.289953 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.289980 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290005 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290026 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290050 5116 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290077 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290188 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290215 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290255 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290285 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290293 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290322 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.290430 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") pod \"94167a49-edda-444e-bb01-c8b24c818557\" (UID: \"94167a49-edda-444e-bb01-c8b24c818557\") " Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291012 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291207 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291241 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291365 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.291425 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/94167a49-edda-444e-bb01-c8b24c818557-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.335058 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.373894 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.381502 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393005 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393045 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/94167a49-edda-444e-bb01-c8b24c818557-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393058 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.393070 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.419664 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: 
"94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.432410 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.436590 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49" (OuterVolumeSpecName: "kube-api-access-lwt49") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "kube-api-access-lwt49". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.494375 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/94167a49-edda-444e-bb01-c8b24c818557-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.494421 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.494439 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwt49\" (UniqueName: \"kubernetes.io/projected/94167a49-edda-444e-bb01-c8b24c818557-kube-api-access-lwt49\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.661750 5116 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.696841 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.884496 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"94167a49-edda-444e-bb01-c8b24c818557","Type":"ContainerDied","Data":"53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459"} Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.884544 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53613d79a9b2b5b46d05a807ddfaa532608c11cacf61914f830a9347006a7459" Mar 22 00:28:17 crc kubenswrapper[5116]: I0322 00:28:17.884628 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 22 00:28:20 crc kubenswrapper[5116]: I0322 00:28:20.415671 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "94167a49-edda-444e-bb01-c8b24c818557" (UID: "94167a49-edda-444e-bb01-c8b24c818557"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:20 crc kubenswrapper[5116]: I0322 00:28:20.436442 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/94167a49-edda-444e-bb01-c8b24c818557-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.035080 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036830 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerName="oc" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036864 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerName="oc" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036883 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="docker-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036890 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="docker-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036910 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="git-clone" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036929 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="git-clone" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036938 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="manage-dockerfile" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.036944 5116 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="manage-dockerfile" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.037064 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" containerName="oc" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.037082 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="94167a49-edda-444e-bb01-c8b24c818557" containerName="docker-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.097749 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.097920 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100603 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-1-sys-config\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100624 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100629 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-1-global-ca\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.100761 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-1-ca\"" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270236 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " 
pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270296 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270325 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270512 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270580 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270643 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: 
\"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270672 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270715 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270746 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270834 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270904 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.270930 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372543 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372610 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372642 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372657 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372682 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372728 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372759 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.372774 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373060 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373199 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373424 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373438 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod 
\"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373558 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373628 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373657 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373662 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 
00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.373890 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.374477 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.382800 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.383615 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.389861 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"sg-bridge-1-build\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.409993 5116 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.610019 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:22 crc kubenswrapper[5116]: I0322 00:28:22.930146 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerStarted","Data":"771bfa800f24952fe981a0692f46e9fb7737913672821b07cfca47622fd48f98"} Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.057522 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.057599 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.057656 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.058302 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:28:23 crc kubenswrapper[5116]: 
I0322 00:28:23.058368 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd" gracePeriod=600 Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944391 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd" exitCode=0 Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944504 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd"} Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944836 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3"} Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.944873 5116 scope.go:117] "RemoveContainer" containerID="04d3547ec13e400d85bea3193ce33dfe9d5bb94e3b6f8025eabfb2cf4e55b029" Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.948442 5116 generic.go:358] "Generic (PLEG): container finished" podID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerID="d50c7a1c36052a74413838c90756db463a10e07fe6248f6acd9c877e8aa8ecfb" exitCode=0 Mar 22 00:28:23 crc kubenswrapper[5116]: I0322 00:28:23.948649 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" 
event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerDied","Data":"d50c7a1c36052a74413838c90756db463a10e07fe6248f6acd9c877e8aa8ecfb"} Mar 22 00:28:24 crc kubenswrapper[5116]: I0322 00:28:24.959078 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerStarted","Data":"096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413"} Mar 22 00:28:24 crc kubenswrapper[5116]: I0322 00:28:24.977385 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.977367562 podStartE2EDuration="2.977367562s" podCreationTimestamp="2026-03-22 00:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:28:24.976492995 +0000 UTC m=+1175.998794368" watchObservedRunningTime="2026-03-22 00:28:24.977367562 +0000 UTC m=+1175.999668935" Mar 22 00:28:33 crc kubenswrapper[5116]: I0322 00:28:32.506281 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:33 crc kubenswrapper[5116]: I0322 00:28:32.507157 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/sg-bridge-1-build" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build" containerID="cri-o://096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413" gracePeriod=30 Mar 22 00:28:34 crc kubenswrapper[5116]: I0322 00:28:34.876773 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.263859 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.264135 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.267083 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-2-global-ca\"" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.267083 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-2-sys-config\"" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.267361 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"sg-bridge-2-ca\"" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376244 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376368 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376411 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376606 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376692 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376814 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376868 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376951 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.376996 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.377044 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.377076 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.377213 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.479303 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.479722 
5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.479746 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480054 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480134 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480239 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480396 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480502 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480557 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480484 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480667 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.480858 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.481021 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.481073 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.481765 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482004 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482417 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") 
pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482436 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482437 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.482854 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.483086 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.500158 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " 
pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.500863 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.514076 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"sg-bridge-2-build\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:35 crc kubenswrapper[5116]: I0322 00:28:35.583630 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.046484 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_d7582c75-581a-41a0-9a56-c0eda9df5932/docker-build/0.log" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.047380 5116 generic.go:358] "Generic (PLEG): container finished" podID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerID="096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413" exitCode=1 Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.047582 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerDied","Data":"096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413"} Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.135713 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 22 00:28:36 crc kubenswrapper[5116]: W0322 
00:28:36.141530 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef337a4f_2072_4bd4_b8bf_c6f17e8de5c0.slice/crio-295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972 WatchSource:0}: Error finding container 295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972: Status 404 returned error can't find the container with id 295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972 Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.772632 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_d7582c75-581a-41a0-9a56-c0eda9df5932/docker-build/0.log" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.773369 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903517 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903561 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903579 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " 
Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903625 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903678 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903696 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903698 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903724 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903782 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.903960 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904060 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904110 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") pod \"d7582c75-581a-41a0-9a56-c0eda9df5932\" (UID: \"d7582c75-581a-41a0-9a56-c0eda9df5932\") " Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904573 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904761 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.904815 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.905498 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.905752 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.906219 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.906439 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.907470 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.907578 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.911569 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm" (OuterVolumeSpecName: "kube-api-access-ntttm") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "kube-api-access-ntttm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.912782 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.917630 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:28:36 crc kubenswrapper[5116]: I0322 00:28:36.977364 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d7582c75-581a-41a0-9a56-c0eda9df5932" (UID: "d7582c75-581a-41a0-9a56-c0eda9df5932"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006210 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006473 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006635 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/d7582c75-581a-41a0-9a56-c0eda9df5932-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.006810 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007024 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ntttm\" (UniqueName: \"kubernetes.io/projected/d7582c75-581a-41a0-9a56-c0eda9df5932-kube-api-access-ntttm\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007232 
5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007468 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007612 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d7582c75-581a-41a0-9a56-c0eda9df5932-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007743 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d7582c75-581a-41a0-9a56-c0eda9df5932-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.007872 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d7582c75-581a-41a0-9a56-c0eda9df5932-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.058191 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerStarted","Data":"05b58eee940a9312ea4c72bf7cd6002eb63f68736ff8b05594077378816eeb27"} Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.058246 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerStarted","Data":"295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972"} Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061247 5116 
log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_d7582c75-581a-41a0-9a56-c0eda9df5932/docker-build/0.log" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061589 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"d7582c75-581a-41a0-9a56-c0eda9df5932","Type":"ContainerDied","Data":"771bfa800f24952fe981a0692f46e9fb7737913672821b07cfca47622fd48f98"} Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061634 5116 scope.go:117] "RemoveContainer" containerID="096ae5fa3279f5daede4cba4128f49869a9b4e67267b09ec957a247283477413" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.061782 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.099277 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.106988 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.112201 5116 scope.go:117] "RemoveContainer" containerID="d50c7a1c36052a74413838c90756db463a10e07fe6248f6acd9c877e8aa8ecfb" Mar 22 00:28:37 crc kubenswrapper[5116]: I0322 00:28:37.704275 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" path="/var/lib/kubelet/pods/d7582c75-581a-41a0-9a56-c0eda9df5932/volumes" Mar 22 00:28:38 crc kubenswrapper[5116]: I0322 00:28:38.072484 5116 generic.go:358] "Generic (PLEG): container finished" podID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerID="05b58eee940a9312ea4c72bf7cd6002eb63f68736ff8b05594077378816eeb27" exitCode=0 Mar 22 00:28:38 crc kubenswrapper[5116]: I0322 00:28:38.072553 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" 
event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"05b58eee940a9312ea4c72bf7cd6002eb63f68736ff8b05594077378816eeb27"} Mar 22 00:28:39 crc kubenswrapper[5116]: I0322 00:28:39.080723 5116 generic.go:358] "Generic (PLEG): container finished" podID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerID="095801c54210d9a2713b5cbe7ba8078ce5eb8730ede892ce28c5c1b4dcc05f21" exitCode=0 Mar 22 00:28:39 crc kubenswrapper[5116]: I0322 00:28:39.080767 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"095801c54210d9a2713b5cbe7ba8078ce5eb8730ede892ce28c5c1b4dcc05f21"} Mar 22 00:28:39 crc kubenswrapper[5116]: I0322 00:28:39.121323 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0/manage-dockerfile/0.log" Mar 22 00:28:40 crc kubenswrapper[5116]: I0322 00:28:40.089061 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerStarted","Data":"4fdc43f3d25442590d2fddb2c31c0e77f73d2b4f4d2d5df7d52f1ad537e21d5d"} Mar 22 00:28:40 crc kubenswrapper[5116]: I0322 00:28:40.110648 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=6.110631231 podStartE2EDuration="6.110631231s" podCreationTimestamp="2026-03-22 00:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:28:40.110112985 +0000 UTC m=+1191.132414358" watchObservedRunningTime="2026-03-22 00:28:40.110631231 +0000 UTC m=+1191.132932604" Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.056684 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.075962 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.082565 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.085006 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:28:50 crc kubenswrapper[5116]: I0322 00:28:50.836809 5116 scope.go:117] "RemoveContainer" containerID="2ddc5261a5a71c38bc0367d296e6678e1f3648ba5a05ce953a8407e9e3ce8a74" Mar 22 00:29:28 crc kubenswrapper[5116]: I0322 00:29:28.422775 5116 generic.go:358] "Generic (PLEG): container finished" podID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerID="4fdc43f3d25442590d2fddb2c31c0e77f73d2b4f4d2d5df7d52f1ad537e21d5d" exitCode=0 Mar 22 00:29:28 crc kubenswrapper[5116]: I0322 00:29:28.422942 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"4fdc43f3d25442590d2fddb2c31c0e77f73d2b4f4d2d5df7d52f1ad537e21d5d"} Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.702289 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854199 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854263 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854283 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854315 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854338 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854837 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854908 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854923 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854944 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.854978 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855002 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855016 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855068 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") pod \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\" (UID: \"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0\") " Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855349 5116 reconciler_common.go:299] "Volume detached for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855335 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855526 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.855812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.856352 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.857315 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.857953 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.859291 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.859452 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.859548 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7" (OuterVolumeSpecName: "kube-api-access-r6cg7") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "kube-api-access-r6cg7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956239 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956269 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956278 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956287 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956295 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956303 5116 reconciler_common.go:299] "Volume 
detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956311 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956319 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.956326 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r6cg7\" (UniqueName: \"kubernetes.io/projected/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-kube-api-access-r6cg7\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:29 crc kubenswrapper[5116]: I0322 00:29:29.988175 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.058414 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.451684 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0","Type":"ContainerDied","Data":"295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972"} Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.451729 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="295b8055add135e795a4f83a474e1897406680ac19e00964983493bcb6166972" Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.451822 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.652006 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" (UID: "ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:30 crc kubenswrapper[5116]: I0322 00:29:30.667756 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:30 crc kubenswrapper[5116]: E0322 00:29:30.782408 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef337a4f_2072_4bd4_b8bf_c6f17e8de5c0.slice\": RecentStats: unable to find data in memory cache]" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.007984 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009149 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="manage-dockerfile" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009188 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="manage-dockerfile" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009215 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="manage-dockerfile" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009223 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="manage-dockerfile" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009236 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="docker-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009244 5116 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="docker-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009281 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009289 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009297 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="git-clone" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009306 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="git-clone" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009442 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7582c75-581a-41a0-9a56-c0eda9df5932" containerName="docker-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.009458 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef337a4f-2072-4bd4-b8bf-c6f17e8de5c0" containerName="docker-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.039048 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.039223 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.041960 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.042099 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-1-global-ca\"" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.042717 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-1-sys-config\"" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.044312 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-1-ca\"" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121453 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121521 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121557 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121582 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121602 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121624 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121643 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc 
kubenswrapper[5116]: I0322 00:29:34.121688 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121851 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121868 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121893 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.121911 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223297 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223347 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223461 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223506 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223531 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223585 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223638 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223679 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.223733 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224054 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224137 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224808 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.224877 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225051 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225077 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225313 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225585 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.225950 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.233737 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.240034 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.249818 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmgq\" (UniqueName: 
\"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.358103 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.793228 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:34 crc kubenswrapper[5116]: I0322 00:29:34.798569 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:29:35 crc kubenswrapper[5116]: I0322 00:29:35.489222 5116 generic.go:358] "Generic (PLEG): container finished" podID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" exitCode=0 Mar 22 00:29:35 crc kubenswrapper[5116]: I0322 00:29:35.489465 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerDied","Data":"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa"} Mar 22 00:29:35 crc kubenswrapper[5116]: I0322 00:29:35.490419 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerStarted","Data":"42db315b2f8931ff27b8e8122f743c37cd2891b092b07d7e09f6d9b04be78a9b"} Mar 22 00:29:36 crc kubenswrapper[5116]: I0322 00:29:36.499700 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerStarted","Data":"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d"} 
Mar 22 00:29:36 crc kubenswrapper[5116]: I0322 00:29:36.522883 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.522869772 podStartE2EDuration="3.522869772s" podCreationTimestamp="2026-03-22 00:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:29:36.522611284 +0000 UTC m=+1247.544912667" watchObservedRunningTime="2026-03-22 00:29:36.522869772 +0000 UTC m=+1247.545171145" Mar 22 00:29:44 crc kubenswrapper[5116]: I0322 00:29:44.893096 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:44 crc kubenswrapper[5116]: I0322 00:29:44.893671 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" containerID="cri-o://cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" gracePeriod=30 Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.736515 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_dc017a91-5056-42fd-8d59-57baacaf0c14/docker-build/0.log" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.737333 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752863 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752918 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752949 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752973 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.752997 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753032 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753058 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753085 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753106 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753189 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753214 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.753240 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") pod \"dc017a91-5056-42fd-8d59-57baacaf0c14\" (UID: \"dc017a91-5056-42fd-8d59-57baacaf0c14\") " Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.755142 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.755779 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.756541 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.757477 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.757535 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.757564 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.762082 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.762846 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.764049 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq" (OuterVolumeSpecName: "kube-api-access-crmgq") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "kube-api-access-crmgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.764066 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.813494 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846079 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_dc017a91-5056-42fd-8d59-57baacaf0c14/docker-build/0.log" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846569 5116 generic.go:358] "Generic (PLEG): container finished" podID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" exitCode=1 Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846622 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerDied","Data":"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d"} Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846686 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"dc017a91-5056-42fd-8d59-57baacaf0c14","Type":"ContainerDied","Data":"42db315b2f8931ff27b8e8122f743c37cd2891b092b07d7e09f6d9b04be78a9b"} Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846707 5116 scope.go:117] "RemoveContainer" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.846646 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855574 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855607 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855619 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855631 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855643 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855653 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/dc017a91-5056-42fd-8d59-57baacaf0c14-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855667 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-buildworkdir\") 
on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855678 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/dc017a91-5056-42fd-8d59-57baacaf0c14-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855689 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-crmgq\" (UniqueName: \"kubernetes.io/projected/dc017a91-5056-42fd-8d59-57baacaf0c14-kube-api-access-crmgq\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855701 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.855711 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc017a91-5056-42fd-8d59-57baacaf0c14-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.880726 5116 scope.go:117] "RemoveContainer" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.940161 5116 scope.go:117] "RemoveContainer" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" Mar 22 00:29:45 crc kubenswrapper[5116]: E0322 00:29:45.940609 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d\": container with ID starting with cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d not found: ID does not exist" containerID="cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d" Mar 22 00:29:45 crc 
kubenswrapper[5116]: I0322 00:29:45.940650 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d"} err="failed to get container status \"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d\": rpc error: code = NotFound desc = could not find container \"cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d\": container with ID starting with cc2e1a43efd40a8a4ff5cc7bf6fe4c4bdfeadaa488aef4e52d401e6f183cd39d not found: ID does not exist" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.940679 5116 scope.go:117] "RemoveContainer" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" Mar 22 00:29:45 crc kubenswrapper[5116]: E0322 00:29:45.941021 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa\": container with ID starting with 1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa not found: ID does not exist" containerID="1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa" Mar 22 00:29:45 crc kubenswrapper[5116]: I0322 00:29:45.941044 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa"} err="failed to get container status \"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa\": rpc error: code = NotFound desc = could not find container \"1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa\": container with ID starting with 1c8ebaeda44b7a3dff7bb55c721d652206b4b163ee6d5e5d58e60088384434fa not found: ID does not exist" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.080115 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "dc017a91-5056-42fd-8d59-57baacaf0c14" (UID: "dc017a91-5056-42fd-8d59-57baacaf0c14"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.159284 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/dc017a91-5056-42fd-8d59-57baacaf0c14-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.189337 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.195739 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.578294 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579399 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579428 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579514 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="manage-dockerfile" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.579525 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="manage-dockerfile" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 
00:29:46.579746 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" containerName="docker-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.626987 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.627070 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.628925 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-2-global-ca\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.628976 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-2-ca\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.629239 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-webhook-snmp-2-sys-config\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.630196 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665085 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665482 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665517 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665573 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665605 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665634 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc 
kubenswrapper[5116]: I0322 00:29:46.665667 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665708 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665736 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665799 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665825 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.665860 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767588 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767631 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767670 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767691 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767723 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767748 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767756 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767824 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 
00:29:46.767770 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767885 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767921 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.767960 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768001 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768036 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768309 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768538 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768626 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.768701 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.769331 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.769850 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.770249 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.775346 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.779466 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.793150 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:46 crc kubenswrapper[5116]: I0322 00:29:46.950106 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.140418 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.709630 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc017a91-5056-42fd-8d59-57baacaf0c14" path="/var/lib/kubelet/pods/dc017a91-5056-42fd-8d59-57baacaf0c14/volumes" Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.866656 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerStarted","Data":"9f1455b4e1d683b8147a4e7cbe39bfd2b64e3db289d053d240e16d814779b540"} Mar 22 00:29:47 crc kubenswrapper[5116]: I0322 00:29:47.866743 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerStarted","Data":"a8d14e8cd61c30324bd428b5c66778dfcef4610119afccd6d0a42381bcfec935"} Mar 22 00:29:48 crc kubenswrapper[5116]: I0322 00:29:48.872857 5116 
generic.go:358] "Generic (PLEG): container finished" podID="80323414-c785-4e29-ac99-d15e78a522e6" containerID="9f1455b4e1d683b8147a4e7cbe39bfd2b64e3db289d053d240e16d814779b540" exitCode=0 Mar 22 00:29:48 crc kubenswrapper[5116]: I0322 00:29:48.872915 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"9f1455b4e1d683b8147a4e7cbe39bfd2b64e3db289d053d240e16d814779b540"} Mar 22 00:29:49 crc kubenswrapper[5116]: I0322 00:29:49.883193 5116 generic.go:358] "Generic (PLEG): container finished" podID="80323414-c785-4e29-ac99-d15e78a522e6" containerID="cc649ae09f5f63aa5638af626fb6ba6a9bce42b912741ed45d606cb5dadab8f7" exitCode=0 Mar 22 00:29:49 crc kubenswrapper[5116]: I0322 00:29:49.883308 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"cc649ae09f5f63aa5638af626fb6ba6a9bce42b912741ed45d606cb5dadab8f7"} Mar 22 00:29:49 crc kubenswrapper[5116]: I0322 00:29:49.950796 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_80323414-c785-4e29-ac99-d15e78a522e6/manage-dockerfile/0.log" Mar 22 00:29:50 crc kubenswrapper[5116]: I0322 00:29:50.892543 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerStarted","Data":"cc2dd2bd84c435afd43ea278dfbf8c6acf00d8d304d7a3b1991c1034d5caed81"} Mar 22 00:29:50 crc kubenswrapper[5116]: I0322 00:29:50.915357 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=4.915336728 podStartE2EDuration="4.915336728s" podCreationTimestamp="2026-03-22 00:29:46 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:29:50.914719279 +0000 UTC m=+1261.937020662" watchObservedRunningTime="2026-03-22 00:29:50.915336728 +0000 UTC m=+1261.937638101" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.129735 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.136735 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.136913 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.139702 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.141471 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.143515 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.144051 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-config\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.144389 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operator-lifecycle-manager\"/\"collect-profiles-dockercfg-vfqp6\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.146772 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.146961 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.154333 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272196 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272299 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7km4\" (UniqueName: 
\"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272503 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"auto-csr-approver-29568990-vx25q\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.272663 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.373474 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.373569 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"auto-csr-approver-29568990-vx25q\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc 
kubenswrapper[5116]: I0322 00:30:00.373611 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.373654 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.375218 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.387364 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.394016 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"auto-csr-approver-29568990-vx25q\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " 
pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.403699 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"collect-profiles-29568990-5fml5\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.458864 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.474963 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.681261 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5"] Mar 22 00:30:00 crc kubenswrapper[5116]: W0322 00:30:00.708512 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c0d10d_b2a7_4649_a456_658425b37334.slice/crio-883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a WatchSource:0}: Error finding container 883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a: Status 404 returned error can't find the container with id 883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.931069 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.972773 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568990-vx25q" 
event={"ID":"753649c7-f19a-4b90-a29a-2108a691e934","Type":"ContainerStarted","Data":"3a1396751c56c487fc0af757cd9cc0665099170218de35e0c80420c8d61d1ed1"} Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.974047 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerStarted","Data":"9fc9693d6cf3e606702f2ea7cd0f1963e72f90585a2a3981d32b3514955061d7"} Mar 22 00:30:00 crc kubenswrapper[5116]: I0322 00:30:00.974135 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerStarted","Data":"883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a"} Mar 22 00:30:01 crc kubenswrapper[5116]: I0322 00:30:01.000401 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" podStartSLOduration=1.000376021 podStartE2EDuration="1.000376021s" podCreationTimestamp="2026-03-22 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:30:00.99620662 +0000 UTC m=+1272.018508023" watchObservedRunningTime="2026-03-22 00:30:01.000376021 +0000 UTC m=+1272.022677424" Mar 22 00:30:01 crc kubenswrapper[5116]: I0322 00:30:01.985234 5116 generic.go:358] "Generic (PLEG): container finished" podID="18c0d10d-b2a7-4649-a456-658425b37334" containerID="9fc9693d6cf3e606702f2ea7cd0f1963e72f90585a2a3981d32b3514955061d7" exitCode=0 Mar 22 00:30:01 crc kubenswrapper[5116]: I0322 00:30:01.985398 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" 
event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerDied","Data":"9fc9693d6cf3e606702f2ea7cd0f1963e72f90585a2a3981d32b3514955061d7"} Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.290656 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.414590 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") pod \"18c0d10d-b2a7-4649-a456-658425b37334\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.414687 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") pod \"18c0d10d-b2a7-4649-a456-658425b37334\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.415363 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") pod \"18c0d10d-b2a7-4649-a456-658425b37334\" (UID: \"18c0d10d-b2a7-4649-a456-658425b37334\") " Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.416788 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume" (OuterVolumeSpecName: "config-volume") pod "18c0d10d-b2a7-4649-a456-658425b37334" (UID: "18c0d10d-b2a7-4649-a456-658425b37334"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.422263 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4" (OuterVolumeSpecName: "kube-api-access-h7km4") pod "18c0d10d-b2a7-4649-a456-658425b37334" (UID: "18c0d10d-b2a7-4649-a456-658425b37334"). InnerVolumeSpecName "kube-api-access-h7km4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.426076 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "18c0d10d-b2a7-4649-a456-658425b37334" (UID: "18c0d10d-b2a7-4649-a456-658425b37334"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.517818 5116 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18c0d10d-b2a7-4649-a456-658425b37334-config-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.517857 5116 reconciler_common.go:299] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18c0d10d-b2a7-4649-a456-658425b37334-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:03 crc kubenswrapper[5116]: I0322 00:30:03.517869 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7km4\" (UniqueName: \"kubernetes.io/projected/18c0d10d-b2a7-4649-a456-658425b37334-kube-api-access-h7km4\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.001428 5116 generic.go:358] "Generic (PLEG): container finished" podID="753649c7-f19a-4b90-a29a-2108a691e934" 
containerID="85bd7d245d43142b219c3be5ac468eafb8ba3e6e6a34155393343c620d8b140b" exitCode=0 Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.001540 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568990-vx25q" event={"ID":"753649c7-f19a-4b90-a29a-2108a691e934","Type":"ContainerDied","Data":"85bd7d245d43142b219c3be5ac468eafb8ba3e6e6a34155393343c620d8b140b"} Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.003866 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" event={"ID":"18c0d10d-b2a7-4649-a456-658425b37334","Type":"ContainerDied","Data":"883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a"} Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.003944 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="883ffff50b87983f260eca59092d5da7720f8d91e5a851c02748a363864cc02a" Mar 22 00:30:04 crc kubenswrapper[5116]: I0322 00:30:04.004078 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29568990-5fml5" Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.275299 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.338260 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") pod \"753649c7-f19a-4b90-a29a-2108a691e934\" (UID: \"753649c7-f19a-4b90-a29a-2108a691e934\") " Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.348251 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v" (OuterVolumeSpecName: "kube-api-access-6ww6v") pod "753649c7-f19a-4b90-a29a-2108a691e934" (UID: "753649c7-f19a-4b90-a29a-2108a691e934"). InnerVolumeSpecName "kube-api-access-6ww6v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:30:05 crc kubenswrapper[5116]: I0322 00:30:05.439406 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ww6v\" (UniqueName: \"kubernetes.io/projected/753649c7-f19a-4b90-a29a-2108a691e934-kube-api-access-6ww6v\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.017367 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568990-vx25q" event={"ID":"753649c7-f19a-4b90-a29a-2108a691e934","Type":"ContainerDied","Data":"3a1396751c56c487fc0af757cd9cc0665099170218de35e0c80420c8d61d1ed1"} Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.018076 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a1396751c56c487fc0af757cd9cc0665099170218de35e0c80420c8d61d1ed1" Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.017626 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568990-vx25q" Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.338313 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"] Mar 22 00:30:06 crc kubenswrapper[5116]: I0322 00:30:06.343261 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568984-7vpl2"] Mar 22 00:30:07 crc kubenswrapper[5116]: I0322 00:30:07.706650 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd880bf8-6058-4924-8268-c4cdcd44bdcf" path="/var/lib/kubelet/pods/dd880bf8-6058-4924-8268-c4cdcd44bdcf/volumes" Mar 22 00:30:23 crc kubenswrapper[5116]: I0322 00:30:23.057244 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:30:23 crc kubenswrapper[5116]: I0322 00:30:23.057860 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:30:48 crc kubenswrapper[5116]: I0322 00:30:48.532596 5116 generic.go:358] "Generic (PLEG): container finished" podID="80323414-c785-4e29-ac99-d15e78a522e6" containerID="cc2dd2bd84c435afd43ea278dfbf8c6acf00d8d304d7a3b1991c1034d5caed81" exitCode=0 Mar 22 00:30:48 crc kubenswrapper[5116]: I0322 00:30:48.532646 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"cc2dd2bd84c435afd43ea278dfbf8c6acf00d8d304d7a3b1991c1034d5caed81"} Mar 22 00:30:49 crc kubenswrapper[5116]: I0322 00:30:49.842880 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.017898 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.017976 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018012 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018034 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018097 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018143 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018242 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018326 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018355 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018379 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: 
\"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018406 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018473 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") pod \"80323414-c785-4e29-ac99-d15e78a522e6\" (UID: \"80323414-c785-4e29-ac99-d15e78a522e6\") " Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018534 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.018597 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019013 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019037 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019278 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019681 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019707 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019721 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019735 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80323414-c785-4e29-ac99-d15e78a522e6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.019747 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/80323414-c785-4e29-ac99-d15e78a522e6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.020004 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.020610 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.025099 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh" (OuterVolumeSpecName: "kube-api-access-65nzh") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "kube-api-access-65nzh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.025334 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.025572 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120581 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120620 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120629 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120637 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65nzh\" (UniqueName: \"kubernetes.io/projected/80323414-c785-4e29-ac99-d15e78a522e6-kube-api-access-65nzh\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.120648 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/80323414-c785-4e29-ac99-d15e78a522e6-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.154382 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.222357 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.552180 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.552193 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"80323414-c785-4e29-ac99-d15e78a522e6","Type":"ContainerDied","Data":"a8d14e8cd61c30324bd428b5c66778dfcef4610119afccd6d0a42381bcfec935"} Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.552227 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d14e8cd61c30324bd428b5c66778dfcef4610119afccd6d0a42381bcfec935" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.953492 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "80323414-c785-4e29-ac99-d15e78a522e6" (UID: "80323414-c785-4e29-ac99-d15e78a522e6"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:30:50 crc kubenswrapper[5116]: I0322 00:30:50.992016 5116 scope.go:117] "RemoveContainer" containerID="c55d0c630755e42e331267ab717de759a85e99a7760f057cab2cc5e7dd612af4" Mar 22 00:30:51 crc kubenswrapper[5116]: I0322 00:30:51.034446 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/80323414-c785-4e29-ac99-d15e78a522e6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:30:53 crc kubenswrapper[5116]: I0322 00:30:53.057772 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:30:53 crc kubenswrapper[5116]: I0322 00:30:53.057854 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.073386 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074392 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="git-clone" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074404 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="git-clone" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074422 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="manage-dockerfile" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074429 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="manage-dockerfile" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074469 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="753649c7-f19a-4b90-a29a-2108a691e934" containerName="oc" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074475 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="753649c7-f19a-4b90-a29a-2108a691e934" containerName="oc" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074498 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18c0d10d-b2a7-4649-a456-658425b37334" containerName="collect-profiles" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074507 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c0d10d-b2a7-4649-a456-658425b37334" containerName="collect-profiles" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074517 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="docker-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074523 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="docker-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074629 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="753649c7-f19a-4b90-a29a-2108a691e934" containerName="oc" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074650 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="80323414-c785-4e29-ac99-d15e78a522e6" containerName="docker-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.074659 5116 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="18c0d10d-b2a7-4649-a456-658425b37334" containerName="collect-profiles" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.203620 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.203766 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.205443 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-1-global-ca\"" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.205476 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.206081 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-1-sys-config\"" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.206912 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-1-ca\"" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240838 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240878 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240910 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.240934 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241026 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241188 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241264 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241379 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241412 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241483 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241565 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.241610 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342635 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342691 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342710 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod 
\"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342752 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342805 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.342930 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343073 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343106 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343120 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343150 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343245 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343295 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343343 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343380 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343670 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343766 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343800 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.343966 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.344038 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.344128 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.344560 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.348238 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.348618 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.361963 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.517945 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:30:59 crc kubenswrapper[5116]: I0322 00:30:59.717306 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 22 00:30:59 crc kubenswrapper[5116]: W0322 00:30:59.723246 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod258a2fcf_8ff5_4c12_bf18_c6a2e8d3b125.slice/crio-ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf WatchSource:0}: Error finding container ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf: Status 404 returned error can't find the container with id ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf
Mar 22 00:31:00 crc kubenswrapper[5116]: I0322 00:31:00.639986 5116 generic.go:358] "Generic (PLEG): container finished" podID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerID="4106dafe916c442ff6a6c84ef7c764001d5f6257e8b17fde82563cb3dcaa24f7" exitCode=0
Mar 22 00:31:00 crc kubenswrapper[5116]: I0322 00:31:00.640067 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerDied","Data":"4106dafe916c442ff6a6c84ef7c764001d5f6257e8b17fde82563cb3dcaa24f7"}
Mar 22 00:31:00 crc kubenswrapper[5116]: I0322 00:31:00.640449 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerStarted","Data":"ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf"}
Mar 22 00:31:01 crc kubenswrapper[5116]: I0322 00:31:01.651230 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/docker-build/0.log"
Mar 22 00:31:01 crc kubenswrapper[5116]: I0322 00:31:01.651828 5116 generic.go:358] "Generic (PLEG): container finished" podID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerID="9efa61524ace63b51ff35c7e96502efaaf7a080de1173648c97977985912b667" exitCode=1
Mar 22 00:31:01 crc kubenswrapper[5116]: I0322 00:31:01.651870 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerDied","Data":"9efa61524ace63b51ff35c7e96502efaaf7a080de1173648c97977985912b667"}
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.931706 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/docker-build/0.log"
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.932153 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994702 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994790 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994829 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994871 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994897 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994937 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.994981 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995086 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995119 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995195 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995244 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.995296 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") pod \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\" (UID: \"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125\") "
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996038 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996075 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996219 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996442 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996483 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996562 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996920 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.996969 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:31:02 crc kubenswrapper[5116]: I0322 00:31:02.997481 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.001087 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7" (OuterVolumeSpecName: "kube-api-access-v96f7") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "kube-api-access-v96f7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.001222 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.002301 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" (UID: "258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096476 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096516 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096525 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096535 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096548 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096557 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096565 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096573 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096581 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096591 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v96f7\" (UniqueName: \"kubernetes.io/projected/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-kube-api-access-v96f7\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096600 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.096612 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.669460 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/docker-build/0.log"
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.670226 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build"
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.670246 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125","Type":"ContainerDied","Data":"ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf"}
Mar 22 00:31:03 crc kubenswrapper[5116]: I0322 00:31:03.670285 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddae9249247859a23357a1d34a4447a186c26fece6f5dafa5e2ee50ce8dc5bcf"
Mar 22 00:31:09 crc kubenswrapper[5116]: I0322 00:31:09.820858 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 22 00:31:09 crc kubenswrapper[5116]: I0322 00:31:09.831152 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"]
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.396798 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"]
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.398640 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="manage-dockerfile"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.398798 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="manage-dockerfile"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.398939 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="docker-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.399048 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="docker-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.399444 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" containerName="docker-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.409620 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.412883 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-2-global-ca\""
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.412893 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-2-ca\""
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.413102 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-bundle-2-sys-config\""
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.413224 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\""
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.414049 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"]
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515701 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515767 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515850 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515900 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.515970 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516010 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516062 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516156 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516223 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516251 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516299 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.516326 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618143 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618257 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618320 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618371 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618404 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618478 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618519 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618559 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618606 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618649 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618688 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618779 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.618947 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619385 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619582 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619921 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.619943 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.620288 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.620451 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.622057 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build"
Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.622260 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") "
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.632030 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.632556 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.651603 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.706107 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125" path="/var/lib/kubelet/pods/258a2fcf-8ff5-4c12-bf18-c6a2e8d3b125/volumes" Mar 22 00:31:11 crc kubenswrapper[5116]: I0322 00:31:11.733884 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:12 crc kubenswrapper[5116]: I0322 00:31:12.001500 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 22 00:31:12 crc kubenswrapper[5116]: I0322 00:31:12.732354 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerStarted","Data":"6dfe2256b6ccd919ea4b14f777ef296f7562bf0cb1382ff9a867530fbfd779ff"} Mar 22 00:31:12 crc kubenswrapper[5116]: I0322 00:31:12.733380 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerStarted","Data":"66430cee706b285a6fea6a00a1645b2321d2adfb9b419b2649762ad243ebcf40"} Mar 22 00:31:13 crc kubenswrapper[5116]: I0322 00:31:13.740711 5116 generic.go:358] "Generic (PLEG): container finished" podID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerID="6dfe2256b6ccd919ea4b14f777ef296f7562bf0cb1382ff9a867530fbfd779ff" exitCode=0 Mar 22 00:31:13 crc kubenswrapper[5116]: I0322 00:31:13.740787 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"6dfe2256b6ccd919ea4b14f777ef296f7562bf0cb1382ff9a867530fbfd779ff"} Mar 22 00:31:14 crc kubenswrapper[5116]: I0322 00:31:14.749190 5116 generic.go:358] "Generic (PLEG): container finished" podID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerID="a093665aa1e085f39ea150d88b2f17d5e4d29cdbc6776804e097d4c6ab856552" exitCode=0 Mar 22 00:31:14 crc kubenswrapper[5116]: I0322 00:31:14.749249 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"a093665aa1e085f39ea150d88b2f17d5e4d29cdbc6776804e097d4c6ab856552"} Mar 22 00:31:14 crc kubenswrapper[5116]: I0322 00:31:14.784370 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_e4b4f564-3e34-47db-a558-376d32b6d7e3/manage-dockerfile/0.log" Mar 22 00:31:15 crc kubenswrapper[5116]: I0322 00:31:15.759925 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerStarted","Data":"c7e0bb0f8d6531da174daaada44701a4c5b4e858735065eb40a8d1dec71c974c"} Mar 22 00:31:15 crc kubenswrapper[5116]: I0322 00:31:15.795808 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.79578666 podStartE2EDuration="4.79578666s" podCreationTimestamp="2026-03-22 00:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:31:15.789241404 +0000 UTC m=+1346.811542787" watchObservedRunningTime="2026-03-22 00:31:15.79578666 +0000 UTC m=+1346.818088033" Mar 22 00:31:19 crc kubenswrapper[5116]: I0322 00:31:19.791026 5116 generic.go:358] "Generic (PLEG): container finished" podID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerID="c7e0bb0f8d6531da174daaada44701a4c5b4e858735065eb40a8d1dec71c974c" exitCode=0 Mar 22 00:31:19 crc kubenswrapper[5116]: I0322 00:31:19.791069 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"c7e0bb0f8d6531da174daaada44701a4c5b4e858735065eb40a8d1dec71c974c"} Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.082246 5116 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.155635 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.155769 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.155857 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.156400 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.156627 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 
00:31:21.156952 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.156978 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157056 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157097 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157117 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157192 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: 
\"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.157216 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") pod \"e4b4f564-3e34-47db-a558-376d32b6d7e3\" (UID: \"e4b4f564-3e34-47db-a558-376d32b6d7e3\") " Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158276 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158294 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158578 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.158702 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.159244 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.159404 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.159478 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.165631 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.166875 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.169928 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.170371 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.172054 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km" (OuterVolumeSpecName: "kube-api-access-8h8km") pod "e4b4f564-3e34-47db-a558-376d32b6d7e3" (UID: "e4b4f564-3e34-47db-a558-376d32b6d7e3"). InnerVolumeSpecName "kube-api-access-8h8km". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259417 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8h8km\" (UniqueName: \"kubernetes.io/projected/e4b4f564-3e34-47db-a558-376d32b6d7e3-kube-api-access-8h8km\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259457 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259472 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259483 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259495 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259507 5116 
reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259518 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/e4b4f564-3e34-47db-a558-376d32b6d7e3-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259529 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259541 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e4b4f564-3e34-47db-a558-376d32b6d7e3-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259553 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259565 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/e4b4f564-3e34-47db-a558-376d32b6d7e3-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.259580 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/e4b4f564-3e34-47db-a558-376d32b6d7e3-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.811208 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"e4b4f564-3e34-47db-a558-376d32b6d7e3","Type":"ContainerDied","Data":"66430cee706b285a6fea6a00a1645b2321d2adfb9b419b2649762ad243ebcf40"} Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.811502 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66430cee706b285a6fea6a00a1645b2321d2adfb9b419b2649762ad243ebcf40" Mar 22 00:31:21 crc kubenswrapper[5116]: I0322 00:31:21.811302 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.056841 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.056949 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.057015 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.057898 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.058016 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" containerID="cri-o://49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3" gracePeriod=600 Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828475 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3" exitCode=0 Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828543 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3"} Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828939 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"} Mar 22 00:31:23 crc kubenswrapper[5116]: I0322 00:31:23.828966 5116 scope.go:117] "RemoveContainer" containerID="b7eab597c02f83e56182c2492582a8c35b3583e68514ad1f9af8e387f2ee97bd" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.757624 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758360 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="git-clone" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758374 5116 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="git-clone" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758396 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="manage-dockerfile" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758404 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="manage-dockerfile" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758424 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="docker-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758431 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="docker-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.758586 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4b4f564-3e34-47db-a558-376d32b6d7e3" containerName="docker-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.947951 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.948320 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.950945 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-1-ca\"" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.951037 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-1-sys-config\"" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.951378 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-1-global-ca\"" Mar 22 00:31:24 crc kubenswrapper[5116]: I0322 00:31:24.953621 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027049 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027116 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027314 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027513 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027592 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027749 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027821 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.027987 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028017 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028095 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028155 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.028201 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129750 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129791 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.129997 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130101 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130185 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130251 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130282 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130334 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod 
\"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130378 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130396 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130455 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130537 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130590 5116 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130615 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130733 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130745 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.130845 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc 
kubenswrapper[5116]: I0322 00:31:25.131093 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.131099 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.131209 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.138484 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.139231 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: 
\"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.147970 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.268708 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.706934 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 22 00:31:25 crc kubenswrapper[5116]: W0322 00:31:25.710900 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d404dae_83c3_4875_8b37_3240f8a35259.slice/crio-57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7 WatchSource:0}: Error finding container 57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7: Status 404 returned error can't find the container with id 57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7 Mar 22 00:31:25 crc kubenswrapper[5116]: I0322 00:31:25.853216 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerStarted","Data":"57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7"} Mar 22 00:31:26 crc kubenswrapper[5116]: I0322 00:31:26.861962 5116 generic.go:358] "Generic (PLEG): container finished" podID="5d404dae-83c3-4875-8b37-3240f8a35259" 
containerID="095ee6b2ca6fc427f5a465f41b63b9951a947f316d689c28f653b52df20fd554" exitCode=0 Mar 22 00:31:26 crc kubenswrapper[5116]: I0322 00:31:26.862097 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerDied","Data":"095ee6b2ca6fc427f5a465f41b63b9951a947f316d689c28f653b52df20fd554"} Mar 22 00:31:27 crc kubenswrapper[5116]: I0322 00:31:27.874532 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_5d404dae-83c3-4875-8b37-3240f8a35259/docker-build/0.log" Mar 22 00:31:27 crc kubenswrapper[5116]: I0322 00:31:27.875514 5116 generic.go:358] "Generic (PLEG): container finished" podID="5d404dae-83c3-4875-8b37-3240f8a35259" containerID="2a6933fce9cc74d356d1d5a231d547fdb2259fd5ca76515682cf2f100750a7ff" exitCode=1 Mar 22 00:31:27 crc kubenswrapper[5116]: I0322 00:31:27.875629 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerDied","Data":"2a6933fce9cc74d356d1d5a231d547fdb2259fd5ca76515682cf2f100750a7ff"} Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.141705 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_5d404dae-83c3-4875-8b37-3240f8a35259/docker-build/0.log" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.142484 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287294 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287374 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287407 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287488 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287618 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287642 5116 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287664 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287714 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287805 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287836 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287903 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.287944 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") pod \"5d404dae-83c3-4875-8b37-3240f8a35259\" (UID: \"5d404dae-83c3-4875-8b37-3240f8a35259\") " Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288259 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288581 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288604 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288643 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.288717 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.289113 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.289646 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.290590 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.292504 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.296030 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq" (OuterVolumeSpecName: "kube-api-access-zqnrq") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "kube-api-access-zqnrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.297033 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.302195 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "5d404dae-83c3-4875-8b37-3240f8a35259" (UID: "5d404dae-83c3-4875-8b37-3240f8a35259"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389070 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zqnrq\" (UniqueName: \"kubernetes.io/projected/5d404dae-83c3-4875-8b37-3240f8a35259-kube-api-access-zqnrq\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389120 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389136 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389146 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389158 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389190 5116 
reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5d404dae-83c3-4875-8b37-3240f8a35259-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389201 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389212 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389223 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d404dae-83c3-4875-8b37-3240f8a35259-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389233 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389245 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/5d404dae-83c3-4875-8b37-3240f8a35259-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.389255 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5d404dae-83c3-4875-8b37-3240f8a35259-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890089 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_5d404dae-83c3-4875-8b37-3240f8a35259/docker-build/0.log"
Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890795 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890803 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"5d404dae-83c3-4875-8b37-3240f8a35259","Type":"ContainerDied","Data":"57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7"}
Mar 22 00:31:29 crc kubenswrapper[5116]: I0322 00:31:29.890868 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57827d37053f3315c5c6bcc6af9143071581dab86aae5c8f52624b7c138e80e7"
Mar 22 00:31:35 crc kubenswrapper[5116]: I0322 00:31:35.283813 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 22 00:31:35 crc kubenswrapper[5116]: I0322 00:31:35.293403 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 22 00:31:35 crc kubenswrapper[5116]: I0322 00:31:35.707697 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" path="/var/lib/kubelet/pods/5d404dae-83c3-4875-8b37-3240f8a35259/volumes"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.887442 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"]
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888411 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="manage-dockerfile"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888429 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="manage-dockerfile"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888444 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="docker-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888452 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="docker-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.888617 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d404dae-83c3-4875-8b37-3240f8a35259" containerName="docker-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.903264 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907106 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-2-ca\""
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907153 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-2-global-ca\""
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907112 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\""
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.907955 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-bundle-2-sys-config\""
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.919727 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"]
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.989922 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.989998 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990038 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990094 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990115 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990196 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990282 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990399 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990425 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990545 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990596 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:36 crc kubenswrapper[5116]: I0322 00:31:36.990654 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092417 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092461 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092481 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.092665 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093701 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093734 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093522 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093869 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093919 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093947 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093965 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094011 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.093265 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094034 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094043 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094081 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094054 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094241 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094332 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094479 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.094771 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.100061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.100065 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.109490 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.220514 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.461024 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"]
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.955669 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerStarted","Data":"8b809c921fad586c32a2fe64225aefd18ac9e19113f917fa44e139134998184b"}
Mar 22 00:31:37 crc kubenswrapper[5116]: I0322 00:31:37.956065 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerStarted","Data":"c86a8ab826af7ddc8585f7822b5a9d7c3603da327e396daa6ea858ea8f24b95e"}
Mar 22 00:31:38 crc kubenswrapper[5116]: I0322 00:31:38.987991 5116 generic.go:358] "Generic (PLEG): container finished" podID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerID="8b809c921fad586c32a2fe64225aefd18ac9e19113f917fa44e139134998184b" exitCode=0
Mar 22 00:31:38 crc kubenswrapper[5116]: I0322 00:31:38.988044 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"8b809c921fad586c32a2fe64225aefd18ac9e19113f917fa44e139134998184b"}
Mar 22 00:31:40 crc kubenswrapper[5116]: I0322 00:31:40.000584 5116 generic.go:358] "Generic (PLEG): container finished" podID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerID="b4960de0e26fafe9b64394aff120e75e95fa9e842b754124cc9dedd6c2af58c4" exitCode=0
Mar 22 00:31:40 crc kubenswrapper[5116]: I0322 00:31:40.000807 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"b4960de0e26fafe9b64394aff120e75e95fa9e842b754124cc9dedd6c2af58c4"}
Mar 22 00:31:40 crc kubenswrapper[5116]: I0322 00:31:40.044389 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_af329613-40e2-4658-86b9-a1d39a51a9ac/manage-dockerfile/0.log"
Mar 22 00:31:41 crc kubenswrapper[5116]: I0322 00:31:41.012453 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerStarted","Data":"065c9d4f1db0c26993215352e7cce4dde4864d6117683d62a0e4adac61c43f63"}
Mar 22 00:31:41 crc kubenswrapper[5116]: I0322 00:31:41.038088 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=5.038071404 podStartE2EDuration="5.038071404s" podCreationTimestamp="2026-03-22 00:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:31:41.03541997 +0000 UTC m=+1372.057721343" watchObservedRunningTime="2026-03-22 00:31:41.038071404 +0000 UTC m=+1372.060372777"
Mar 22 00:31:46 crc kubenswrapper[5116]: I0322 00:31:46.051139 5116 generic.go:358] "Generic (PLEG): container finished" podID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerID="065c9d4f1db0c26993215352e7cce4dde4864d6117683d62a0e4adac61c43f63" exitCode=0
Mar 22 00:31:46 crc kubenswrapper[5116]: I0322 00:31:46.051214 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"065c9d4f1db0c26993215352e7cce4dde4864d6117683d62a0e4adac61c43f63"}
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.329912 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441643 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441683 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441714 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441760 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441836 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441907 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.441934 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442019 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442075 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442397 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443012 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443002 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443467 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443569 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.442100 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443656 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443684 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.443845 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.444769 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.444836 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") pod \"af329613-40e2-4658-86b9-a1d39a51a9ac\" (UID: \"af329613-40e2-4658-86b9-a1d39a51a9ac\") "
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445462 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445525 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445538 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445565 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445578 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/af329613-40e2-4658-86b9-a1d39a51a9ac-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445589 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445602 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.445613 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/af329613-40e2-4658-86b9-a1d39a51a9ac-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.448687 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.453893 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl" (OuterVolumeSpecName: "kube-api-access-b6fhl") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "kube-api-access-b6fhl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.453897 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.456895 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "af329613-40e2-4658-86b9-a1d39a51a9ac" (UID: "af329613-40e2-4658-86b9-a1d39a51a9ac"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546489 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546521 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/af329613-40e2-4658-86b9-a1d39a51a9ac-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546532 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/af329613-40e2-4658-86b9-a1d39a51a9ac-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:47 crc kubenswrapper[5116]: I0322 00:31:47.546541 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6fhl\" (UniqueName: \"kubernetes.io/projected/af329613-40e2-4658-86b9-a1d39a51a9ac-kube-api-access-b6fhl\") on node \"crc\" DevicePath \"\""
Mar 22 00:31:48 crc kubenswrapper[5116]: I0322 00:31:48.073953 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"af329613-40e2-4658-86b9-a1d39a51a9ac","Type":"ContainerDied","Data":"c86a8ab826af7ddc8585f7822b5a9d7c3603da327e396daa6ea858ea8f24b95e"}
Mar 22 00:31:48 crc kubenswrapper[5116]: I0322 00:31:48.073987 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86a8ab826af7ddc8585f7822b5a9d7c3603da327e396daa6ea858ea8f24b95e"
Mar 22 00:31:48 crc kubenswrapper[5116]: I0322 00:31:48.074059 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.132488 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"]
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133701 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="manage-dockerfile"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133715 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="manage-dockerfile"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133735 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="docker-build"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133741 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="docker-build"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133757 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="git-clone"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133763 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="git-clone"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.133866 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="af329613-40e2-4658-86b9-a1d39a51a9ac" containerName="docker-build"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.141088 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.143687 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.144067 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.145422 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"]
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.149865 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.220740 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"auto-csr-approver-29568992-2pdr9\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " pod="openshift-infra/auto-csr-approver-29568992-2pdr9"
Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.322010 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"auto-csr-approver-29568992-2pdr9\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " pod="openshift-infra/auto-csr-approver-29568992-2pdr9"
Mar 22 00:32:00 crc
kubenswrapper[5116]: I0322 00:32:00.342028 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"auto-csr-approver-29568992-2pdr9\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.466946 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:00 crc kubenswrapper[5116]: I0322 00:32:00.678062 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"] Mar 22 00:32:01 crc kubenswrapper[5116]: I0322 00:32:01.179729 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" event={"ID":"6a9e5029-c0b6-4edb-a790-d67f5791bab2","Type":"ContainerStarted","Data":"107d510fc55bcf7ed0d7f1994bf108101a66973dfce548697cc10916b771397e"} Mar 22 00:32:02 crc kubenswrapper[5116]: I0322 00:32:02.187345 5116 generic.go:358] "Generic (PLEG): container finished" podID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerID="b895d59dbbcf506c3c0f496201be78280c2900728c487398d81316e12e931fac" exitCode=0 Mar 22 00:32:02 crc kubenswrapper[5116]: I0322 00:32:02.187436 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" event={"ID":"6a9e5029-c0b6-4edb-a790-d67f5791bab2","Type":"ContainerDied","Data":"b895d59dbbcf506c3c0f496201be78280c2900728c487398d81316e12e931fac"} Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.441850 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.567021 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") pod \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\" (UID: \"6a9e5029-c0b6-4edb-a790-d67f5791bab2\") " Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.576228 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4" (OuterVolumeSpecName: "kube-api-access-fk6s4") pod "6a9e5029-c0b6-4edb-a790-d67f5791bab2" (UID: "6a9e5029-c0b6-4edb-a790-d67f5791bab2"). InnerVolumeSpecName "kube-api-access-fk6s4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:32:03 crc kubenswrapper[5116]: I0322 00:32:03.669889 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fk6s4\" (UniqueName: \"kubernetes.io/projected/6a9e5029-c0b6-4edb-a790-d67f5791bab2-kube-api-access-fk6s4\") on node \"crc\" DevicePath \"\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.210864 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" event={"ID":"6a9e5029-c0b6-4edb-a790-d67f5791bab2","Type":"ContainerDied","Data":"107d510fc55bcf7ed0d7f1994bf108101a66973dfce548697cc10916b771397e"} Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.211267 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="107d510fc55bcf7ed0d7f1994bf108101a66973dfce548697cc10916b771397e" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.211136 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568992-2pdr9" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.387621 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.388272 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerName="oc" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.388288 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerName="oc" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.388379 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" containerName="oc" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.393532 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.395462 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-dockercfg\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.395461 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-1-sys-config\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.395982 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"builder-dockercfg-qv5f4\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.396223 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-1-ca\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.396891 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"service-telemetry\"/\"service-telemetry-framework-index-1-global-ca\"" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.410832 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481221 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481284 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481312 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481400 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481441 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481540 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481590 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481641 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481673 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481697 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481821 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481887 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.481948 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-qv5f4-push\" 
(UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.495845 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.502136 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568986-l778x"] Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.582993 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583035 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583055 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583086 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583106 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583122 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583167 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583254 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod 
\"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583434 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583470 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583526 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583568 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583636 5116 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583744 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.583823 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584205 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584252 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584318 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584324 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584375 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584446 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.584571 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.587717 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.587945 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.590987 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.602398 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"c31bf437-0e3b-460f-ba0c-4b172f455201\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.711887 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 22 00:32:04 crc kubenswrapper[5116]: I0322 00:32:04.919431 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 22 00:32:04 crc kubenswrapper[5116]: W0322 00:32:04.928263 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31bf437_0e3b_460f_ba0c_4b172f455201.slice/crio-2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c WatchSource:0}: Error finding container 2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c: Status 404 returned error can't find the container with id 2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c Mar 22 00:32:05 crc kubenswrapper[5116]: I0322 00:32:05.216765 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerStarted","Data":"2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c"} Mar 22 00:32:05 crc kubenswrapper[5116]: I0322 00:32:05.704062 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d42dbb69-e840-4b6a-b719-52396f82919e" path="/var/lib/kubelet/pods/d42dbb69-e840-4b6a-b719-52396f82919e/volumes" Mar 22 00:32:06 crc kubenswrapper[5116]: I0322 00:32:06.229116 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerStarted","Data":"b93bd1618b4a88d7aa23a73088f70123e675abc6fffcdbe560999c59ba33e7db"} Mar 22 00:32:07 crc 
kubenswrapper[5116]: I0322 00:32:07.240802 5116 generic.go:358] "Generic (PLEG): container finished" podID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerID="b93bd1618b4a88d7aa23a73088f70123e675abc6fffcdbe560999c59ba33e7db" exitCode=0
Mar 22 00:32:07 crc kubenswrapper[5116]: I0322 00:32:07.240878 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"b93bd1618b4a88d7aa23a73088f70123e675abc6fffcdbe560999c59ba33e7db"}
Mar 22 00:32:08 crc kubenswrapper[5116]: I0322 00:32:08.250102 5116 generic.go:358] "Generic (PLEG): container finished" podID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerID="3230f47f645627d0982892f22f2beeaf875033c794a1e3775aa92d829e7da434" exitCode=0
Mar 22 00:32:08 crc kubenswrapper[5116]: I0322 00:32:08.250240 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"3230f47f645627d0982892f22f2beeaf875033c794a1e3775aa92d829e7da434"}
Mar 22 00:32:08 crc kubenswrapper[5116]: I0322 00:32:08.287498 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_c31bf437-0e3b-460f-ba0c-4b172f455201/manage-dockerfile/0.log"
Mar 22 00:32:09 crc kubenswrapper[5116]: I0322 00:32:09.260317 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerStarted","Data":"74a06b5b226c72a9a8d2bcbdf47193630da4297aa507d3bbc891de1064f2eb18"}
Mar 22 00:32:09 crc kubenswrapper[5116]: I0322 00:32:09.299887 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.299867097 podStartE2EDuration="5.299867097s" podCreationTimestamp="2026-03-22 00:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:32:09.291134762 +0000 UTC m=+1400.313436155" watchObservedRunningTime="2026-03-22 00:32:09.299867097 +0000 UTC m=+1400.322168480"
Mar 22 00:32:47 crc kubenswrapper[5116]: I0322 00:32:47.539816 5116 generic.go:358] "Generic (PLEG): container finished" podID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerID="74a06b5b226c72a9a8d2bcbdf47193630da4297aa507d3bbc891de1064f2eb18" exitCode=0
Mar 22 00:32:47 crc kubenswrapper[5116]: I0322 00:32:47.539933 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"74a06b5b226c72a9a8d2bcbdf47193630da4297aa507d3bbc891de1064f2eb18"}
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.815281 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.929770 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.929843 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.929865 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930142 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930277 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930370 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930451 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930525 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930633 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930728 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930853 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930967 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.931041 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") pod \"c31bf437-0e3b-460f-ba0c-4b172f455201\" (UID: \"c31bf437-0e3b-460f-ba0c-4b172f455201\") "
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930853 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930859 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930889 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930884 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.930911 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.931715 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.932549 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.936868 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.936944 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm" (OuterVolumeSpecName: "kube-api-access-6j8rm") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "kube-api-access-6j8rm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.937007 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-pull") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "builder-dockercfg-qv5f4-pull". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:32:48 crc kubenswrapper[5116]: I0322 00:32:48.937991 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push" (OuterVolumeSpecName: "builder-dockercfg-qv5f4-push") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "builder-dockercfg-qv5f4-push". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032591 5116 reconciler_common.go:299] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032626 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-push\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-push\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032640 5116 reconciler_common.go:299] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032653 5116 reconciler_common.go:299] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032665 5116 reconciler_common.go:299] "Volume detached for volume \"builder-dockercfg-qv5f4-pull\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-builder-dockercfg-qv5f4-pull\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032676 5116 reconciler_common.go:299] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032686 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032696 5116 reconciler_common.go:299] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/c31bf437-0e3b-460f-ba0c-4b172f455201-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032723 5116 reconciler_common.go:299] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c31bf437-0e3b-460f-ba0c-4b172f455201-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032737 5116 reconciler_common.go:299] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/c31bf437-0e3b-460f-ba0c-4b172f455201-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.032752 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6j8rm\" (UniqueName: \"kubernetes.io/projected/c31bf437-0e3b-460f-ba0c-4b172f455201-kube-api-access-6j8rm\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.207291 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.236124 5116 reconciler_common.go:299] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.558000 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"c31bf437-0e3b-460f-ba0c-4b172f455201","Type":"ContainerDied","Data":"2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c"}
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.558043 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1b040ee59464ba98cbd13a35e5abb1f7ad61dfebbd95563755f14726edf22c"
Mar 22 00:32:49 crc kubenswrapper[5116]: I0322 00:32:49.558053 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.114128 5116 scope.go:117] "RemoveContainer" containerID="795e259e79e05542df53f912755d650098dd25713e478a8d61c158fc1da118ef"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.227567 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "c31bf437-0e3b-460f-ba0c-4b172f455201" (UID: "c31bf437-0e3b-460f-ba0c-4b172f455201"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.268318 5116 reconciler_common.go:299] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/c31bf437-0e3b-460f-ba0c-4b172f455201-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.840303 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"]
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.841921 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="manage-dockerfile"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.842023 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="manage-dockerfile"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.842100 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="docker-build"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.843130 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="docker-build"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.843276 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="git-clone"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.843357 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="git-clone"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.844705 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="c31bf437-0e3b-460f-ba0c-4b172f455201" containerName="docker-build"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.873372 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"]
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.873603 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.876653 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"infrawatch-operators-dockercfg-mwrhj\""
Mar 22 00:32:51 crc kubenswrapper[5116]: I0322 00:32:51.977735 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"infrawatch-operators-pbfzj\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.078919 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"infrawatch-operators-pbfzj\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.105671 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"infrawatch-operators-pbfzj\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") " pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.196751 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:32:52 crc kubenswrapper[5116]: I0322 00:32:52.647897 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"]
Mar 22 00:32:52 crc kubenswrapper[5116]: W0322 00:32:52.663471 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f55d1d_3c22_43be_84f4_9cef2bb268f0.slice/crio-03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f WatchSource:0}: Error finding container 03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f: Status 404 returned error can't find the container with id 03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f
Mar 22 00:32:53 crc kubenswrapper[5116]: I0322 00:32:53.583898 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerStarted","Data":"03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f"}
Mar 22 00:32:57 crc kubenswrapper[5116]: I0322 00:32:57.233073 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"]
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.049050 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-g5xjt"]
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.507694 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-g5xjt"]
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.507828 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.569903 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpx8\" (UniqueName: \"kubernetes.io/projected/8ac45e34-89d0-4eb0-b696-51b35b33b23e-kube-api-access-cjpx8\") pod \"infrawatch-operators-g5xjt\" (UID: \"8ac45e34-89d0-4eb0-b696-51b35b33b23e\") " pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.670700 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpx8\" (UniqueName: \"kubernetes.io/projected/8ac45e34-89d0-4eb0-b696-51b35b33b23e-kube-api-access-cjpx8\") pod \"infrawatch-operators-g5xjt\" (UID: \"8ac45e34-89d0-4eb0-b696-51b35b33b23e\") " pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.710756 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpx8\" (UniqueName: \"kubernetes.io/projected/8ac45e34-89d0-4eb0-b696-51b35b33b23e-kube-api-access-cjpx8\") pod \"infrawatch-operators-g5xjt\" (UID: \"8ac45e34-89d0-4eb0-b696-51b35b33b23e\") " pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:32:58 crc kubenswrapper[5116]: I0322 00:32:58.826665 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.245929 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-g5xjt"]
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.640874 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerStarted","Data":"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"}
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.640996 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-pbfzj" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server" containerID="cri-o://adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" gracePeriod=2
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.645106 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-g5xjt" event={"ID":"8ac45e34-89d0-4eb0-b696-51b35b33b23e","Type":"ContainerStarted","Data":"57983bd1879f74e47a1934e8861f59b2e62c326b1718640ebba28c521ad77e3c"}
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.645203 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-g5xjt" event={"ID":"8ac45e34-89d0-4eb0-b696-51b35b33b23e","Type":"ContainerStarted","Data":"e1ba8e236edf3c272fdce84bd6bc25926a88c4eaa660cb177b22b2d29a8c98b2"}
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.656605 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-pbfzj" podStartSLOduration=2.149997514 podStartE2EDuration="11.656581926s" podCreationTimestamp="2026-03-22 00:32:51 +0000 UTC" firstStartedPulling="2026-03-22 00:32:52.66542245 +0000 UTC m=+1443.687723843" lastFinishedPulling="2026-03-22 00:33:02.172006872 +0000 UTC m=+1453.194308255" observedRunningTime="2026-03-22 00:33:02.654393627 +0000 UTC m=+1453.676695010" watchObservedRunningTime="2026-03-22 00:33:02.656581926 +0000 UTC m=+1453.678883299"
Mar 22 00:33:02 crc kubenswrapper[5116]: I0322 00:33:02.675307 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-g5xjt" podStartSLOduration=4.565823842 podStartE2EDuration="4.675273902s" podCreationTimestamp="2026-03-22 00:32:58 +0000 UTC" firstStartedPulling="2026-03-22 00:33:02.258673287 +0000 UTC m=+1453.280974670" lastFinishedPulling="2026-03-22 00:33:02.368123357 +0000 UTC m=+1453.390424730" observedRunningTime="2026-03-22 00:33:02.670792761 +0000 UTC m=+1453.693094164" watchObservedRunningTime="2026-03-22 00:33:02.675273902 +0000 UTC m=+1453.697575315"
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.184542 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.230446 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") pod \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\" (UID: \"79f55d1d-3c22-43be-84f4-9cef2bb268f0\") "
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.237488 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q" (OuterVolumeSpecName: "kube-api-access-vps4q") pod "79f55d1d-3c22-43be-84f4-9cef2bb268f0" (UID: "79f55d1d-3c22-43be-84f4-9cef2bb268f0"). InnerVolumeSpecName "kube-api-access-vps4q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.333253 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vps4q\" (UniqueName: \"kubernetes.io/projected/79f55d1d-3c22-43be-84f4-9cef2bb268f0-kube-api-access-vps4q\") on node \"crc\" DevicePath \"\""
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663384 5116 generic.go:358] "Generic (PLEG): container finished" podID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb" exitCode=0
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663595 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerDied","Data":"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"}
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663659 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-pbfzj" event={"ID":"79f55d1d-3c22-43be-84f4-9cef2bb268f0","Type":"ContainerDied","Data":"03d28f35ce77fb34c75f7781589e64a095acb527fdbeda78c9d0c617287e2d1f"}
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663690 5116 scope.go:117] "RemoveContainer" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.663567 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-pbfzj"
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.695624 5116 scope.go:117] "RemoveContainer" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"
Mar 22 00:33:03 crc kubenswrapper[5116]: E0322 00:33:03.696139 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb\": container with ID starting with adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb not found: ID does not exist" containerID="adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.696229 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb"} err="failed to get container status \"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb\": rpc error: code = NotFound desc = could not find container \"adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb\": container with ID starting with adf459e10bd634af861991406392ca3f3cc87c5dca73ec4feb3e7454496bf6bb not found: ID does not exist"
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.726790 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"]
Mar 22 00:33:03 crc kubenswrapper[5116]: I0322 00:33:03.737321 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-pbfzj"]
Mar 22 00:33:03 crc kubenswrapper[5116]: E0322 00:33:03.833812 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79f55d1d_3c22_43be_84f4_9cef2bb268f0.slice\": RecentStats: unable to find data in memory cache]"
Mar 22 00:33:05 crc kubenswrapper[5116]: I0322 00:33:05.706153 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" path="/var/lib/kubelet/pods/79f55d1d-3c22-43be-84f4-9cef2bb268f0/volumes"
Mar 22 00:33:08 crc kubenswrapper[5116]: I0322 00:33:08.827311 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:08 crc kubenswrapper[5116]: I0322 00:33:08.827703 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:08 crc kubenswrapper[5116]: I0322 00:33:08.862703 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:09 crc kubenswrapper[5116]: I0322 00:33:09.767374 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-g5xjt"
Mar 22 00:33:23 crc kubenswrapper[5116]: I0322 00:33:23.057494 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 22 00:33:23 crc kubenswrapper[5116]: I0322 00:33:23.057979 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.502524 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"]
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.505040 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.505228 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.505616 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="79f55d1d-3c22-43be-84f4-9cef2bb268f0" containerName="registry-server"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.517788 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"]
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.517972 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.543274 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.543383 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.543527 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645249 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645384 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645445 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645751 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.645826 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.666074 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"
Mar 22 00:33:25 crc kubenswrapper[5116]: I0322 00:33:25.863604 5116 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.088523 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq"] Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.297204 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"] Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.318553 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"] Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.318839 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.357512 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.357593 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.357669 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.458594 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.458675 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.458729 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.459150 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.459254 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.481360 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.632480 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.860432 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerID="b474626d7ff0ec5a95ee881e6ec583483802dd233c57a9ca2ca5510d36b5d845" exitCode=0 Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.860489 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"b474626d7ff0ec5a95ee881e6ec583483802dd233c57a9ca2ca5510d36b5d845"} Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.860845 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerStarted","Data":"a87780d2b4bac827afdf7443e6daa4e225ba7f3f20729ab614d7fe7e816d18bf"} Mar 22 00:33:26 crc kubenswrapper[5116]: I0322 00:33:26.916935 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph"] Mar 22 00:33:26 crc kubenswrapper[5116]: W0322 00:33:26.921592 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e0ed20_34bb_4aa9_a4c8_89dcbd6c3849.slice/crio-c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326 WatchSource:0}: Error finding container c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326: Status 404 returned error can't find the container with id c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326 Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.872038 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fc23f04-b941-4ba1-877b-076f0f27569a" 
containerID="3f4e40260b8dbda06424c155eee70b7f1a5cb6d29b68c5b72bee79ab8f75cd36" exitCode=0 Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.872096 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"3f4e40260b8dbda06424c155eee70b7f1a5cb6d29b68c5b72bee79ab8f75cd36"} Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.874617 5116 generic.go:358] "Generic (PLEG): container finished" podID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerID="d07efa1469944ab7ce5272e63bd6eb423b16d83caaed4ab51b9ddd7648ab0ace" exitCode=0 Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.874671 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"d07efa1469944ab7ce5272e63bd6eb423b16d83caaed4ab51b9ddd7648ab0ace"} Mar 22 00:33:27 crc kubenswrapper[5116]: I0322 00:33:27.874695 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerStarted","Data":"c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326"} Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.886836 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerID="763c994a37e899cabf039fd393317e3c5c41cfac5090cae3eb617ae14452b664" exitCode=0 Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.886964 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" 
event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"763c994a37e899cabf039fd393317e3c5c41cfac5090cae3eb617ae14452b664"} Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.890667 5116 generic.go:358] "Generic (PLEG): container finished" podID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerID="21f8b92f22c044dccc23a9859663e13d5fdfdc8158900fe15d4d76809e28bf7a" exitCode=0 Mar 22 00:33:28 crc kubenswrapper[5116]: I0322 00:33:28.890714 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"21f8b92f22c044dccc23a9859663e13d5fdfdc8158900fe15d4d76809e28bf7a"} Mar 22 00:33:29 crc kubenswrapper[5116]: I0322 00:33:29.902010 5116 generic.go:358] "Generic (PLEG): container finished" podID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerID="9f47b38185c3654b673c989e5c9b0bc8649578fd9e8d08fe1fb50b7b03c3dd7c" exitCode=0 Mar 22 00:33:29 crc kubenswrapper[5116]: I0322 00:33:29.902140 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"9f47b38185c3654b673c989e5c9b0bc8649578fd9e8d08fe1fb50b7b03c3dd7c"} Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.163068 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.215021 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") pod \"4fc23f04-b941-4ba1-877b-076f0f27569a\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.215114 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") pod \"4fc23f04-b941-4ba1-877b-076f0f27569a\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.215224 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") pod \"4fc23f04-b941-4ba1-877b-076f0f27569a\" (UID: \"4fc23f04-b941-4ba1-877b-076f0f27569a\") " Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.216783 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle" (OuterVolumeSpecName: "bundle") pod "4fc23f04-b941-4ba1-877b-076f0f27569a" (UID: "4fc23f04-b941-4ba1-877b-076f0f27569a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.223477 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz" (OuterVolumeSpecName: "kube-api-access-sjspz") pod "4fc23f04-b941-4ba1-877b-076f0f27569a" (UID: "4fc23f04-b941-4ba1-877b-076f0f27569a"). InnerVolumeSpecName "kube-api-access-sjspz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.227779 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util" (OuterVolumeSpecName: "util") pod "4fc23f04-b941-4ba1-877b-076f0f27569a" (UID: "4fc23f04-b941-4ba1-877b-076f0f27569a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.316781 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.316827 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4fc23f04-b941-4ba1-877b-076f0f27569a-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.316840 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjspz\" (UniqueName: \"kubernetes.io/projected/4fc23f04-b941-4ba1-877b-076f0f27569a-kube-api-access-sjspz\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.914094 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" event={"ID":"4fc23f04-b941-4ba1-877b-076f0f27569a","Type":"ContainerDied","Data":"a87780d2b4bac827afdf7443e6daa4e225ba7f3f20729ab614d7fe7e816d18bf"} Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.914344 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87780d2b4bac827afdf7443e6daa4e225ba7f3f20729ab614d7fe7e816d18bf" Mar 22 00:33:30 crc kubenswrapper[5116]: I0322 00:33:30.914119 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c097g4dq" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.172664 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.231452 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") pod \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.231647 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") pod \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.231709 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") pod \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\" (UID: \"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849\") " Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.232392 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle" (OuterVolumeSpecName: "bundle") pod "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" (UID: "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.236492 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226" (OuterVolumeSpecName: "kube-api-access-dn226") pod "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" (UID: "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849"). InnerVolumeSpecName "kube-api-access-dn226". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.250328 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util" (OuterVolumeSpecName: "util") pod "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" (UID: "02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.333580 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dn226\" (UniqueName: \"kubernetes.io/projected/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-kube-api-access-dn226\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.333617 5116 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-util\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.333628 5116 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849-bundle\") on node \"crc\" DevicePath \"\"" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.923877 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" 
event={"ID":"02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849","Type":"ContainerDied","Data":"c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326"} Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.924267 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c25ede99fbecacb36309b5ab19ceeac97bc7d5bda3d415e16de1216927b55326" Mar 22 00:33:31 crc kubenswrapper[5116]: I0322 00:33:31.923915 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65ak4pph" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.410830 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-55b77595c-67kjz"] Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411871 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="util" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411886 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="util" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411900 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="util" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411905 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="util" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411912 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="pull" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411918 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="pull" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 
00:33:34.411936 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="extract" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411941 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="extract" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411957 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="pull" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411962 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="pull" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411978 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="extract" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.411982 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="extract" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.412066 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fc23f04-b941-4ba1-877b-076f0f27569a" containerName="extract" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.412074 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="02e0ed20-34bb-4aa9-a4c8-89dcbd6c3849" containerName="extract" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.431655 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b77595c-67kjz"] Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.431806 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.435240 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"service-telemetry-operator-dockercfg-xzx4z\"" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.477320 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8nj\" (UniqueName: \"kubernetes.io/projected/fcfe47c3-284e-4c8b-b35f-06a0499313a5-kube-api-access-hc8nj\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.477389 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/fcfe47c3-284e-4c8b-b35f-06a0499313a5-runner\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.578962 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8nj\" (UniqueName: \"kubernetes.io/projected/fcfe47c3-284e-4c8b-b35f-06a0499313a5-kube-api-access-hc8nj\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.579017 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/fcfe47c3-284e-4c8b-b35f-06a0499313a5-runner\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " 
pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.579447 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/fcfe47c3-284e-4c8b-b35f-06a0499313a5-runner\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.596254 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8nj\" (UniqueName: \"kubernetes.io/projected/fcfe47c3-284e-4c8b-b35f-06a0499313a5-kube-api-access-hc8nj\") pod \"service-telemetry-operator-55b77595c-67kjz\" (UID: \"fcfe47c3-284e-4c8b-b35f-06a0499313a5\") " pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:34 crc kubenswrapper[5116]: I0322 00:33:34.774434 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.006417 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b77595c-67kjz"] Mar 22 00:33:35 crc kubenswrapper[5116]: W0322 00:33:35.024562 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcfe47c3_284e_4c8b_b35f_06a0499313a5.slice/crio-e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a WatchSource:0}: Error finding container e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a: Status 404 returned error can't find the container with id e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.847528 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-564975b589-lf9hg"] Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.862260 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-564975b589-lf9hg"] Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.862392 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.865486 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"smart-gateway-operator-dockercfg-bh5jq\"" Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.963556 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" event={"ID":"fcfe47c3-284e-4c8b-b35f-06a0499313a5","Type":"ContainerStarted","Data":"e711f3f65c1651ad702f4b50acc3651680881daaf5d35d0eb5ea82737d63871a"} Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.996453 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5f8e2cff-6459-4aae-8cf6-a48587311f68-runner\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:35 crc kubenswrapper[5116]: I0322 00:33:35.996607 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw42l\" (UniqueName: \"kubernetes.io/projected/5f8e2cff-6459-4aae-8cf6-a48587311f68-kube-api-access-cw42l\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.099084 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5f8e2cff-6459-4aae-8cf6-a48587311f68-runner\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.099348 5116 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw42l\" (UniqueName: \"kubernetes.io/projected/5f8e2cff-6459-4aae-8cf6-a48587311f68-kube-api-access-cw42l\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.100031 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5f8e2cff-6459-4aae-8cf6-a48587311f68-runner\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.143124 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw42l\" (UniqueName: \"kubernetes.io/projected/5f8e2cff-6459-4aae-8cf6-a48587311f68-kube-api-access-cw42l\") pod \"smart-gateway-operator-564975b589-lf9hg\" (UID: \"5f8e2cff-6459-4aae-8cf6-a48587311f68\") " pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.188650 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.615366 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-564975b589-lf9hg"] Mar 22 00:33:36 crc kubenswrapper[5116]: W0322 00:33:36.630613 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8e2cff_6459_4aae_8cf6_a48587311f68.slice/crio-ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24 WatchSource:0}: Error finding container ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24: Status 404 returned error can't find the container with id ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24 Mar 22 00:33:36 crc kubenswrapper[5116]: I0322 00:33:36.993266 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" event={"ID":"5f8e2cff-6459-4aae-8cf6-a48587311f68","Type":"ContainerStarted","Data":"ddfdbba80b737e24522a2072d06990961fde6df9b9a306b06325a4922f7c5b24"} Mar 22 00:33:53 crc kubenswrapper[5116]: I0322 00:33:53.056902 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:33:53 crc kubenswrapper[5116]: I0322 00:33:53.057253 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.113995 5116 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.119841 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log" Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.134250 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:33:56 crc kubenswrapper[5116]: I0322 00:33:56.137508 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log" Mar 22 00:33:57 crc kubenswrapper[5116]: I0322 00:33:57.161558 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" event={"ID":"fcfe47c3-284e-4c8b-b35f-06a0499313a5","Type":"ContainerStarted","Data":"dad31067a41f6fc3033d6f01d5d1b9e7b4f361746cc24e0263c7d437271c296e"} Mar 22 00:33:57 crc kubenswrapper[5116]: I0322 00:33:57.163933 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" event={"ID":"5f8e2cff-6459-4aae-8cf6-a48587311f68","Type":"ContainerStarted","Data":"0b6bffef25f9036758329187792f9e0fe15eaad0e6870e9c13b26f13baff06fa"} Mar 22 00:33:57 crc kubenswrapper[5116]: I0322 00:33:57.184763 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-55b77595c-67kjz" podStartSLOduration=2.090863636 podStartE2EDuration="23.1847485s" podCreationTimestamp="2026-03-22 00:33:34 +0000 UTC" firstStartedPulling="2026-03-22 00:33:35.027118425 +0000 UTC m=+1486.049419798" lastFinishedPulling="2026-03-22 00:33:56.121003299 +0000 UTC m=+1507.143304662" 
observedRunningTime="2026-03-22 00:33:57.183406448 +0000 UTC m=+1508.205707821" watchObservedRunningTime="2026-03-22 00:33:57.1847485 +0000 UTC m=+1508.207049873" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.153849 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-564975b589-lf9hg" podStartSLOduration=5.646590447 podStartE2EDuration="25.153819694s" podCreationTimestamp="2026-03-22 00:33:35 +0000 UTC" firstStartedPulling="2026-03-22 00:33:36.63257059 +0000 UTC m=+1487.654871973" lastFinishedPulling="2026-03-22 00:33:56.139799857 +0000 UTC m=+1507.162101220" observedRunningTime="2026-03-22 00:33:57.204906922 +0000 UTC m=+1508.227208325" watchObservedRunningTime="2026-03-22 00:34:00.153819694 +0000 UTC m=+1511.176121077" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.154824 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.198953 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.199110 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.201396 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.203240 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.203417 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.264449 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"auto-csr-approver-29568994-tx27l\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.366218 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"auto-csr-approver-29568994-tx27l\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.390801 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"auto-csr-approver-29568994-tx27l\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.516012 5116 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:00 crc kubenswrapper[5116]: I0322 00:34:00.984033 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:34:00 crc kubenswrapper[5116]: W0322 00:34:00.987416 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7f4a688_3ca1_4538_9b16_323899848ec1.slice/crio-7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a WatchSource:0}: Error finding container 7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a: Status 404 returned error can't find the container with id 7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a Mar 22 00:34:01 crc kubenswrapper[5116]: I0322 00:34:01.196731 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerStarted","Data":"7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a"} Mar 22 00:34:02 crc kubenswrapper[5116]: I0322 00:34:02.207081 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerStarted","Data":"1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4"} Mar 22 00:34:02 crc kubenswrapper[5116]: I0322 00:34:02.227217 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29568994-tx27l" podStartSLOduration=1.30926425 podStartE2EDuration="2.227196772s" podCreationTimestamp="2026-03-22 00:34:00 +0000 UTC" firstStartedPulling="2026-03-22 00:34:00.989738748 +0000 UTC m=+1512.012040121" lastFinishedPulling="2026-03-22 00:34:01.90767127 +0000 UTC m=+1512.929972643" observedRunningTime="2026-03-22 00:34:02.225074066 
+0000 UTC m=+1513.247375449" watchObservedRunningTime="2026-03-22 00:34:02.227196772 +0000 UTC m=+1513.249498155" Mar 22 00:34:03 crc kubenswrapper[5116]: I0322 00:34:03.217111 5116 generic.go:358] "Generic (PLEG): container finished" podID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerID="1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4" exitCode=0 Mar 22 00:34:03 crc kubenswrapper[5116]: I0322 00:34:03.217229 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerDied","Data":"1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4"} Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.568925 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.736486 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") pod \"a7f4a688-3ca1-4538-9b16-323899848ec1\" (UID: \"a7f4a688-3ca1-4538-9b16-323899848ec1\") " Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.743050 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw" (OuterVolumeSpecName: "kube-api-access-r5xkw") pod "a7f4a688-3ca1-4538-9b16-323899848ec1" (UID: "a7f4a688-3ca1-4538-9b16-323899848ec1"). InnerVolumeSpecName "kube-api-access-r5xkw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:34:04 crc kubenswrapper[5116]: I0322 00:34:04.839568 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5xkw\" (UniqueName: \"kubernetes.io/projected/a7f4a688-3ca1-4538-9b16-323899848ec1-kube-api-access-r5xkw\") on node \"crc\" DevicePath \"\"" Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.241021 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568994-tx27l" event={"ID":"a7f4a688-3ca1-4538-9b16-323899848ec1","Type":"ContainerDied","Data":"7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a"} Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.241076 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ee80db9b009c402d8f94909edc7814c40d12193ccaf7f41dc54a94dd06f1a7a" Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.241111 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568994-tx27l" Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.286402 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.305686 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568988-gbkp8"] Mar 22 00:34:05 crc kubenswrapper[5116]: I0322 00:34:05.713015 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135f0dfe-78e2-4264-ae7b-7d6b95ebbb39" path="/var/lib/kubelet/pods/135f0dfe-78e2-4264-ae7b-7d6b95ebbb39/volumes" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.959208 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.960504 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerName="oc" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.960523 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerName="oc" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.960675 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" containerName="oc" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.973560 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.978634 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-openstack-credentials\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.978983 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-users\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979024 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-inter-router-ca\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979135 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-dockercfg-cm89f\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979185 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-interconnect-sasl-config\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.979392 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-openstack-ca\"" Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.980371 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:34:21 crc kubenswrapper[5116]: I0322 00:34:21.982377 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-inter-router-credentials\"" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.096986 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097191 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097331 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097388 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod 
\"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097408 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097463 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.097543 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199080 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc 
kubenswrapper[5116]: I0322 00:34:22.199236 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199289 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199388 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199444 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199502 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: 
\"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.199536 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.203885 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.209050 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.209275 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc 
kubenswrapper[5116]: I0322 00:34:22.210140 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.210267 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.228407 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.234768 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"default-interconnect-55bf8d5cb-r7j5z\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") " pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.296204 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" Mar 22 00:34:22 crc kubenswrapper[5116]: I0322 00:34:22.656731 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.057144 5116 patch_prober.go:28] interesting pod/machine-config-daemon-66g6d container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.057289 5116 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.057383 5116 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.058309 5116 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"} pod="openshift-machine-config-operator/machine-config-daemon-66g6d" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.058415 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerName="machine-config-daemon" 
containerID="cri-o://d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" gracePeriod=600 Mar 22 00:34:23 crc kubenswrapper[5116]: E0322 00:34:23.235694 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.431674 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerStarted","Data":"bf3991c3ece1f301b0d1fccfff351a4b6d180bae7f391fffa01dbd097f35a02d"} Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.436272 5116 generic.go:358] "Generic (PLEG): container finished" podID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" exitCode=0 Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.436475 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerDied","Data":"d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"} Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.436534 5116 scope.go:117] "RemoveContainer" containerID="49fe123552b6212becb6925edd45f7eeb8e4b240ca3f762ba113af9cc73657f3" Mar 22 00:34:23 crc kubenswrapper[5116]: I0322 00:34:23.437439 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:34:23 crc kubenswrapper[5116]: E0322 00:34:23.437897 5116 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:34:28 crc kubenswrapper[5116]: I0322 00:34:28.475281 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerStarted","Data":"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"}
Mar 22 00:34:28 crc kubenswrapper[5116]: I0322 00:34:28.501331 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" podStartSLOduration=2.548602045 podStartE2EDuration="7.501305245s" podCreationTimestamp="2026-03-22 00:34:21 +0000 UTC" firstStartedPulling="2026-03-22 00:34:22.660007231 +0000 UTC m=+1533.682308604" lastFinishedPulling="2026-03-22 00:34:27.612710431 +0000 UTC m=+1538.635011804" observedRunningTime="2026-03-22 00:34:28.493527732 +0000 UTC m=+1539.515829105" watchObservedRunningTime="2026-03-22 00:34:28.501305245 +0000 UTC m=+1539.523606638"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.082719 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.096589 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.102322 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"serving-certs-ca-bundle\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.102522 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-default-tls-assets-0\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.102337 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-default\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.106402 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-stf-dockercfg-9zlsj\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.106739 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-default-rulefiles-2\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107659 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-prometheus-proxy-tls\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107739 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-default-rulefiles-0\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107779 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"prometheus-default-rulefiles-1\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.107814 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-session-secret\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.108028 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"prometheus-default-web-config\""
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.118858 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256205 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-tls-assets\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256270 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256421 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256594 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256648 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-kube-api-access-68cff\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256743 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256874 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.256977 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd2bf44b-01e8-4236-9dda-90998dd75b88-config-out\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257019 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257060 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257083 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-web-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.257125 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358605 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd2bf44b-01e8-4236-9dda-90998dd75b88-config-out\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358667 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358698 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358720 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-web-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358745 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358855 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-tls-assets\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358874 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358902 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358946 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.358969 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-kube-api-access-68cff\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.359007 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.359040 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.360265 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.361260 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.361336 5116 secret.go:189] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found
Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.361398 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls podName:bd2bf44b-01e8-4236-9dda-90998dd75b88 nodeName:}" failed. No retries permitted until 2026-03-22 00:34:32.861380517 +0000 UTC m=+1543.883681900 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "bd2bf44b-01e8-4236-9dda-90998dd75b88") : secret "default-prometheus-proxy-tls" not found
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.361866 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.362342 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bd2bf44b-01e8-4236-9dda-90998dd75b88-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.367373 5116 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.367447 5116 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/68142e33650e2a4f12e38c949b077fedb6bc181702745383b76390572000844b/globalmount\"" pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.367801 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.369576 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd2bf44b-01e8-4236-9dda-90998dd75b88-config-out\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.370475 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.378988 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-tls-assets\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.387545 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-web-config\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.390637 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cff\" (UniqueName: \"kubernetes.io/projected/bd2bf44b-01e8-4236-9dda-90998dd75b88-kube-api-access-68cff\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.422307 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-e34b9271-e06c-4d13-9536-482afafbee1a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e34b9271-e06c-4d13-9536-482afafbee1a\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: I0322 00:34:32.866624 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.868809 5116 secret.go:189] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found
Mar 22 00:34:32 crc kubenswrapper[5116]: E0322 00:34:32.868895 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls podName:bd2bf44b-01e8-4236-9dda-90998dd75b88 nodeName:}" failed. No retries permitted until 2026-03-22 00:34:33.86887372 +0000 UTC m=+1544.891175133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "bd2bf44b-01e8-4236-9dda-90998dd75b88") : secret "default-prometheus-proxy-tls" not found
Mar 22 00:34:33 crc kubenswrapper[5116]: I0322 00:34:33.882152 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:33 crc kubenswrapper[5116]: I0322 00:34:33.888611 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/bd2bf44b-01e8-4236-9dda-90998dd75b88-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"bd2bf44b-01e8-4236-9dda-90998dd75b88\") " pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:33 crc kubenswrapper[5116]: I0322 00:34:33.927828 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0"
Mar 22 00:34:34 crc kubenswrapper[5116]: I0322 00:34:34.220834 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"]
Mar 22 00:34:34 crc kubenswrapper[5116]: W0322 00:34:34.224549 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2bf44b_01e8_4236_9dda_90998dd75b88.slice/crio-8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e WatchSource:0}: Error finding container 8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e: Status 404 returned error can't find the container with id 8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e
Mar 22 00:34:34 crc kubenswrapper[5116]: I0322 00:34:34.525940 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"8838301c1591fe5becdaadf4a2219f431f724bb4598fd181f625fe2fb932704e"}
Mar 22 00:34:36 crc kubenswrapper[5116]: I0322 00:34:36.698015 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:34:36 crc kubenswrapper[5116]: E0322 00:34:36.699079 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:34:38 crc kubenswrapper[5116]: I0322 00:34:38.580804 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"927e86438d6ed8377c017c761d46dcc9030f2456729b3c91c60ced84d2b01f15"}
Mar 22 00:34:41 crc kubenswrapper[5116]: I0322 00:34:41.860666 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-694dc457d5-77wwx"]
Mar 22 00:34:41 crc kubenswrapper[5116]: I0322 00:34:41.871940 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx"
Mar 22 00:34:41 crc kubenswrapper[5116]: I0322 00:34:41.880266 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-694dc457d5-77wwx"]
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.025635 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dpnx\" (UniqueName: \"kubernetes.io/projected/536dbd5d-337e-43fa-925a-e88d3be7da06-kube-api-access-8dpnx\") pod \"default-snmp-webhook-694dc457d5-77wwx\" (UID: \"536dbd5d-337e-43fa-925a-e88d3be7da06\") " pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx"
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.127158 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dpnx\" (UniqueName: \"kubernetes.io/projected/536dbd5d-337e-43fa-925a-e88d3be7da06-kube-api-access-8dpnx\") pod \"default-snmp-webhook-694dc457d5-77wwx\" (UID: \"536dbd5d-337e-43fa-925a-e88d3be7da06\") " pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx"
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.155674 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dpnx\" (UniqueName: \"kubernetes.io/projected/536dbd5d-337e-43fa-925a-e88d3be7da06-kube-api-access-8dpnx\") pod \"default-snmp-webhook-694dc457d5-77wwx\" (UID: \"536dbd5d-337e-43fa-925a-e88d3be7da06\") " pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx"
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.199657 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx"
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.463049 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-694dc457d5-77wwx"]
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.477690 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 22 00:34:42 crc kubenswrapper[5116]: I0322 00:34:42.618059 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" event={"ID":"536dbd5d-337e-43fa-925a-e88d3be7da06","Type":"ContainerStarted","Data":"21df424f5a1103eb9542e3aa79b2d9a068b666580c132c116d2be44b12c91aff"}
Mar 22 00:34:44 crc kubenswrapper[5116]: I0322 00:34:44.632991 5116 generic.go:358] "Generic (PLEG): container finished" podID="bd2bf44b-01e8-4236-9dda-90998dd75b88" containerID="927e86438d6ed8377c017c761d46dcc9030f2456729b3c91c60ced84d2b01f15" exitCode=0
Mar 22 00:34:44 crc kubenswrapper[5116]: I0322 00:34:44.633194 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerDied","Data":"927e86438d6ed8377c017c761d46dcc9030f2456729b3c91c60ced84d2b01f15"}
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.341013 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.380115 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.380318 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.383721 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-alertmanager-proxy-tls\""
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.384103 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-cluster-tls-config\""
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.384317 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-stf-dockercfg-b5vhx\""
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.384450 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-tls-assets-0\""
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.385015 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-generated\""
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.389398 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"alertmanager-default-web-config\""
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503597 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-volume\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503658 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503694 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bt6\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-kube-api-access-c2bt6\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503726 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503762 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503785 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-out\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503800 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503821 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.503836 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-web-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605213 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605297 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605333 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-out\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605358 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605386 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605413 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-web-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: E0322 00:34:45.605510 5116 secret.go:189] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 22 00:34:45 crc kubenswrapper[5116]: E0322 00:34:45.605607 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls podName:6a4e1e85-870e-408e-a4dd-c5a7d7fcecae nodeName:}" failed. No retries permitted until 2026-03-22 00:34:46.105583246 +0000 UTC m=+1557.127884629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "6a4e1e85-870e-408e-a4dd-c5a7d7fcecae") : secret "default-alertmanager-proxy-tls" not found
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.605440 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-volume\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.606459 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.606515 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bt6\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-kube-api-access-c2bt6\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.610187 5116 csi_attacher.go:373] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.612137 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-out\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.612625 5116 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1e6735ba0855894953e997216cd431bb4709665fadbca3e3e6ab15f305ef49eb/globalmount\"" pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.614062 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-web-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.615014 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0"
Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.615822 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-config-volume\") pod \"alertmanager-default-0\" (UID:
\"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.620213 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-tls-assets\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.623412 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.628273 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bt6\" (UniqueName: \"kubernetes.io/projected/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-kube-api-access-c2bt6\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:45 crc kubenswrapper[5116]: I0322 00:34:45.641390 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-adb471bd-3392-4e99-8064-a0c701f964f8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-adb471bd-3392-4e99-8064-a0c701f964f8\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:46 crc kubenswrapper[5116]: I0322 00:34:46.113146 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod 
\"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:46 crc kubenswrapper[5116]: E0322 00:34:46.113315 5116 secret.go:189] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:46 crc kubenswrapper[5116]: E0322 00:34:46.113517 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls podName:6a4e1e85-870e-408e-a4dd-c5a7d7fcecae nodeName:}" failed. No retries permitted until 2026-03-22 00:34:47.113492141 +0000 UTC m=+1558.135793534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "6a4e1e85-870e-408e-a4dd-c5a7d7fcecae") : secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:47 crc kubenswrapper[5116]: I0322 00:34:47.129062 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:47 crc kubenswrapper[5116]: E0322 00:34:47.129279 5116 secret.go:189] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:47 crc kubenswrapper[5116]: E0322 00:34:47.129364 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls podName:6a4e1e85-870e-408e-a4dd-c5a7d7fcecae nodeName:}" failed. 
No retries permitted until 2026-03-22 00:34:49.129346682 +0000 UTC m=+1560.151648055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "6a4e1e85-870e-408e-a4dd-c5a7d7fcecae") : secret "default-alertmanager-proxy-tls" not found Mar 22 00:34:49 crc kubenswrapper[5116]: I0322 00:34:49.158803 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:49 crc kubenswrapper[5116]: I0322 00:34:49.166055 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a4e1e85-870e-408e-a4dd-c5a7d7fcecae-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae\") " pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:49 crc kubenswrapper[5116]: I0322 00:34:49.304068 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.033771 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.707752 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" event={"ID":"536dbd5d-337e-43fa-925a-e88d3be7da06","Type":"ContainerStarted","Data":"631acff58a9da0052a575a8d9aa4f4ea970f2b6070611466cd8f2e9d2f095c3c"} Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.707927 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:34:51 crc kubenswrapper[5116]: E0322 00:34:51.708282 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.710965 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"944a4f90bd33711769e78227464fa7814b61bddfb0732e2e2218d5afcadf8d6a"} Mar 22 00:34:51 crc kubenswrapper[5116]: I0322 00:34:51.726701 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-694dc457d5-77wwx" podStartSLOduration=2.550659586 podStartE2EDuration="10.726680097s" podCreationTimestamp="2026-03-22 00:34:41 +0000 UTC" firstStartedPulling="2026-03-22 00:34:42.47785989 +0000 UTC m=+1553.500161263" lastFinishedPulling="2026-03-22 
00:34:50.653880401 +0000 UTC m=+1561.676181774" observedRunningTime="2026-03-22 00:34:51.721868745 +0000 UTC m=+1562.744170128" watchObservedRunningTime="2026-03-22 00:34:51.726680097 +0000 UTC m=+1562.748981490" Mar 22 00:34:53 crc kubenswrapper[5116]: I0322 00:34:53.729130 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"2b1982fd22c4e01709284329d1904d970fb21c60fb9936f1b710c1c5e9e998af"} Mar 22 00:34:54 crc kubenswrapper[5116]: I0322 00:34:54.647374 5116 scope.go:117] "RemoveContainer" containerID="6aa0210430a18d2b96a0bdd3c189fd69455e4afb5753bc588ac349572da2555d" Mar 22 00:34:55 crc kubenswrapper[5116]: I0322 00:34:55.745662 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"f7efbe447e97065fd04b575590d695c99b1492f6ba1e95418a796d203ccfeeae"} Mar 22 00:34:57 crc kubenswrapper[5116]: I0322 00:34:57.760609 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"4a0fda02ce06bb91956db76ab16c54bf2e872d2e728df77278064c8c47c3417b"} Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.696680 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l"] Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.778953 5116 generic.go:358] "Generic (PLEG): container finished" podID="6a4e1e85-870e-408e-a4dd-c5a7d7fcecae" containerID="2b1982fd22c4e01709284329d1904d970fb21c60fb9936f1b710c1c5e9e998af" exitCode=0 Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.956276 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" 
event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerDied","Data":"2b1982fd22c4e01709284329d1904d970fb21c60fb9936f1b710c1c5e9e998af"} Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.956425 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.956470 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l"] Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.959512 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-coll-meter-sg-core-configmap\"" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.959800 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-cloud1-coll-meter-proxy-tls\"" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.959959 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"smart-gateway-session-secret\"" Mar 22 00:34:58 crc kubenswrapper[5116]: I0322 00:34:58.961723 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"smart-gateway-dockercfg-bd2n9\"" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120410 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120510 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120563 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120602 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.120625 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjjd\" (UniqueName: \"kubernetes.io/projected/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-kube-api-access-cpjjd\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222401 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222481 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222524 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222554 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.222577 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjjd\" (UniqueName: \"kubernetes.io/projected/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-kube-api-access-cpjjd\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: 
\"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.223070 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.223140 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls podName:8e2f0d9c-7207-4164-a27f-9efeba6e22bb nodeName:}" failed. No retries permitted until 2026-03-22 00:34:59.723121943 +0000 UTC m=+1570.745423316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" (UID: "8e2f0d9c-7207-4164-a27f-9efeba6e22bb") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.223833 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.224482 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc 
kubenswrapper[5116]: I0322 00:34:59.234669 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.242473 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjjd\" (UniqueName: \"kubernetes.io/projected/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-kube-api-access-cpjjd\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: I0322 00:34:59.729863 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.730032 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:34:59 crc kubenswrapper[5116]: E0322 00:34:59.730126 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls podName:8e2f0d9c-7207-4164-a27f-9efeba6e22bb nodeName:}" failed. No retries permitted until 2026-03-22 00:35:00.730102479 +0000 UTC m=+1571.752403872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" (UID: "8e2f0d9c-7207-4164-a27f-9efeba6e22bb") : secret "default-cloud1-coll-meter-proxy-tls" not found Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.552254 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.597192 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.597319 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743080 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743188 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743254 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.743321 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.797440 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/8e2f0d9c-7207-4164-a27f-9efeba6e22bb-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l\" (UID: \"8e2f0d9c-7207-4164-a27f-9efeba6e22bb\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844028 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844123 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844220 5116 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844598 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.844953 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.861810 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"community-operators-xvfm2\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:00 crc kubenswrapper[5116]: I0322 00:35:00.930383 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.086556 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.401878 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q"] Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.426394 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q"] Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.426531 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.429673 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-cloud1-ceil-meter-proxy-tls\"" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.430872 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-ceil-meter-sg-core-configmap\"" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552433 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552691 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: 
\"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552809 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552902 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6fw\" (UniqueName: \"kubernetes.io/projected/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-kube-api-access-6m6fw\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.552947 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654082 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc 
kubenswrapper[5116]: I0322 00:35:01.654179 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654237 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6fw\" (UniqueName: \"kubernetes.io/projected/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-kube-api-access-6m6fw\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654265 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.654353 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: E0322 00:35:01.654482 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret 
"default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:01 crc kubenswrapper[5116]: E0322 00:35:01.654547 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls podName:c2a84a49-cbcf-41dd-8ed2-1df6cd7db259 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:02.15452725 +0000 UTC m=+1573.176828633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" (UID: "c2a84a49-cbcf-41dd-8ed2-1df6cd7db259") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.656117 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.658021 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.660864 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: 
\"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:01 crc kubenswrapper[5116]: I0322 00:35:01.670996 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6fw\" (UniqueName: \"kubernetes.io/projected/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-kube-api-access-6m6fw\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:02 crc kubenswrapper[5116]: I0322 00:35:02.160847 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:02 crc kubenswrapper[5116]: E0322 00:35:02.160996 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:02 crc kubenswrapper[5116]: E0322 00:35:02.161058 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls podName:c2a84a49-cbcf-41dd-8ed2-1df6cd7db259 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:03.161041811 +0000 UTC m=+1574.183343184 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" (UID: "c2a84a49-cbcf-41dd-8ed2-1df6cd7db259") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 22 00:35:03 crc kubenswrapper[5116]: I0322 00:35:03.174182 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:03 crc kubenswrapper[5116]: I0322 00:35:03.188405 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a84a49-cbcf-41dd-8ed2-1df6cd7db259-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q\" (UID: \"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:03 crc kubenswrapper[5116]: I0322 00:35:03.245287 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.763293 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.844443 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.853595 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.935813 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.944869 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5"] Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.945073 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.947215 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-cloud1-sens-meter-proxy-tls\"" Mar 22 00:35:04 crc kubenswrapper[5116]: I0322 00:35:04.947836 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-sens-meter-sg-core-configmap\"" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025209 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025334 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6d17f0-5973-421e-9838-1a6195ca1731-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025367 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tb5p\" (UniqueName: \"kubernetes.io/projected/1d6d17f0-5973-421e-9838-1a6195ca1731-kube-api-access-8tb5p\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025390 
5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6d17f0-5973-421e-9838-1a6195ca1731-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.025501 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126531 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6d17f0-5973-421e-9838-1a6195ca1731-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126583 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tb5p\" (UniqueName: \"kubernetes.io/projected/1d6d17f0-5973-421e-9838-1a6195ca1731-kube-api-access-8tb5p\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126613 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/1d6d17f0-5973-421e-9838-1a6195ca1731-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126659 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.126701 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.126868 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.127061 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d6d17f0-5973-421e-9838-1a6195ca1731-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.127143 5116 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls podName:1d6d17f0-5973-421e-9838-1a6195ca1731 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:05.626959406 +0000 UTC m=+1576.649260779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" (UID: "1d6d17f0-5973-421e-9838-1a6195ca1731") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.127412 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d6d17f0-5973-421e-9838-1a6195ca1731-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.150138 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tb5p\" (UniqueName: \"kubernetes.io/projected/1d6d17f0-5973-421e-9838-1a6195ca1731-kube-api-access-8tb5p\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.153806 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 
crc kubenswrapper[5116]: I0322 00:35:05.635465 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.635829 5116 secret.go:189] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: E0322 00:35:05.635904 5116 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls podName:1d6d17f0-5973-421e-9838-1a6195ca1731 nodeName:}" failed. No retries permitted until 2026-03-22 00:35:06.635883592 +0000 UTC m=+1577.658184965 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" (UID: "1d6d17f0-5973-421e-9838-1a6195ca1731") : secret "default-cloud1-sens-meter-proxy-tls" not found Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.849977 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"73af41f1b34db2aa72ddcf2cdfb268e5f5d448440f5f21782adb05c23ebbc7bd"} Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.851302 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerStarted","Data":"7100f58fc42898dd88286865a83e861cb4a7d901326e4759acb51b3639cfe743"} Mar 22 00:35:05 crc kubenswrapper[5116]: I0322 00:35:05.853451 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"2c957c4c3220bf67de939876cc1681bfa89a50c3e0747dd74d1db24c799c2691"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.651590 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.678947 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1d6d17f0-5973-421e-9838-1a6195ca1731-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5\" (UID: \"1d6d17f0-5973-421e-9838-1a6195ca1731\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.697670 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:35:06 crc kubenswrapper[5116]: E0322 00:35:06.697983 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.787103 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.863016 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"0582d2cf2a83fbb70d9e4f9ae0e4a04780481473ce5cdeec5fe7a1f24097d7c6"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.865416 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"bd2bf44b-01e8-4236-9dda-90998dd75b88","Type":"ContainerStarted","Data":"aca9f6bcf759d14aba4e46449177bb7093c935bd7505cb25d6a81b43f1f882e0"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.868872 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"03c3b2e14062911452089c7edc2b2b8d00cd4d1ff77394dfb91ac45933c37e35"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.870553 5116 generic.go:358] "Generic (PLEG): container finished" podID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9" exitCode=0 Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.870677 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.873093 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" 
event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"40c129001d4188023f7270fc28d50f2540d93840c59fe23071464b4a0b6fa30d"} Mar 22 00:35:06 crc kubenswrapper[5116]: I0322 00:35:06.889376 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.204008863 podStartE2EDuration="35.88935983s" podCreationTimestamp="2026-03-22 00:34:31 +0000 UTC" firstStartedPulling="2026-03-22 00:34:34.227360926 +0000 UTC m=+1545.249662309" lastFinishedPulling="2026-03-22 00:35:04.912711903 +0000 UTC m=+1575.935013276" observedRunningTime="2026-03-22 00:35:06.886085717 +0000 UTC m=+1577.908387090" watchObservedRunningTime="2026-03-22 00:35:06.88935983 +0000 UTC m=+1577.911661203" Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.611845 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5"] Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.887824 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerStarted","Data":"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"} Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.893722 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"2cf5090c5ad46f07d8a3dede9837b9ea2d6b6ac8502c78847377024c2bce79c0"} Mar 22 00:35:07 crc kubenswrapper[5116]: I0322 00:35:07.897336 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e"} 
Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.904295 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"5673e97709ad82226f5fb15650c9f69362db1f34846064e348aa9875834ac86e"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.907523 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"df926cb85cf0c0aacf22badcb179b743b5a8a9906e6b9415868832aa05028a7c"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.915367 5116 generic.go:358] "Generic (PLEG): container finished" podID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73" exitCode=0 Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.915568 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.918313 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"} Mar 22 00:35:08 crc kubenswrapper[5116]: I0322 00:35:08.928243 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/prometheus-default-0" Mar 22 00:35:09 crc kubenswrapper[5116]: I0322 00:35:09.928278 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" 
event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerStarted","Data":"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"} Mar 22 00:35:10 crc kubenswrapper[5116]: I0322 00:35:10.956715 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xvfm2" podStartSLOduration=10.280415915 podStartE2EDuration="10.956692367s" podCreationTimestamp="2026-03-22 00:35:00 +0000 UTC" firstStartedPulling="2026-03-22 00:35:06.871110377 +0000 UTC m=+1577.893411760" lastFinishedPulling="2026-03-22 00:35:07.547386839 +0000 UTC m=+1578.569688212" observedRunningTime="2026-03-22 00:35:10.956382237 +0000 UTC m=+1581.978683630" watchObservedRunningTime="2026-03-22 00:35:10.956692367 +0000 UTC m=+1581.978993740" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.004305 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"] Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.074327 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"] Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.074517 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.076612 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"elasticsearch-es-cert\"" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.076686 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-coll-event-sg-core-configmap\"" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152125 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f826e257-238f-4715-8010-f30569577292-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152205 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f826e257-238f-4715-8010-f30569577292-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152350 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6m8\" (UniqueName: \"kubernetes.io/projected/f826e257-238f-4715-8010-f30569577292-kube-api-access-wb6m8\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.152551 5116 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f826e257-238f-4715-8010-f30569577292-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253732 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f826e257-238f-4715-8010-f30569577292-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253776 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f826e257-238f-4715-8010-f30569577292-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253810 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6m8\" (UniqueName: \"kubernetes.io/projected/f826e257-238f-4715-8010-f30569577292-kube-api-access-wb6m8\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.253868 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/f826e257-238f-4715-8010-f30569577292-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.254377 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f826e257-238f-4715-8010-f30569577292-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.254767 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f826e257-238f-4715-8010-f30569577292-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.274952 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6m8\" (UniqueName: \"kubernetes.io/projected/f826e257-238f-4715-8010-f30569577292-kube-api-access-wb6m8\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.274975 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f826e257-238f-4715-8010-f30569577292-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-664479c99b-v2bfp\" (UID: \"f826e257-238f-4715-8010-f30569577292\") " 
pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.390599 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.667100 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"] Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.771368 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.774499 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"default-cloud1-ceil-event-sg-core-configmap\"" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.778711 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkdvr\" (UniqueName: \"kubernetes.io/projected/874ff775-e791-4357-9719-a773b3a8e4d8-kube-api-access-xkdvr\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.778784 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/874ff775-e791-4357-9719-a773b3a8e4d8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.778876 5116 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/874ff775-e791-4357-9719-a773b3a8e4d8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.779126 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/874ff775-e791-4357-9719-a773b3a8e4d8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.779704 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"] Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.881508 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkdvr\" (UniqueName: \"kubernetes.io/projected/874ff775-e791-4357-9719-a773b3a8e4d8-kube-api-access-xkdvr\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.881638 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/874ff775-e791-4357-9719-a773b3a8e4d8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 
00:35:13.881855 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/874ff775-e791-4357-9719-a773b3a8e4d8-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.882129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/874ff775-e791-4357-9719-a773b3a8e4d8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.882642 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/874ff775-e791-4357-9719-a773b3a8e4d8-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.882722 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/874ff775-e791-4357-9719-a773b3a8e4d8-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.892058 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/874ff775-e791-4357-9719-a773b3a8e4d8-elastic-certs\") pod 
\"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.909431 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp"] Mar 22 00:35:13 crc kubenswrapper[5116]: W0322 00:35:13.913539 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf826e257_238f_4715_8010_f30569577292.slice/crio-c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c WatchSource:0}: Error finding container c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c: Status 404 returned error can't find the container with id c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.915005 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkdvr\" (UniqueName: \"kubernetes.io/projected/874ff775-e791-4357-9719-a773b3a8e4d8-kube-api-access-xkdvr\") pod \"default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6\" (UID: \"874ff775-e791-4357-9719-a773b3a8e4d8\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:13 crc kubenswrapper[5116]: I0322 00:35:13.956611 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"c1ee8d7ee2913c76cfec4150a69db340cbf3c5180e1d2ae91c31b92ecf79fe8c"} Mar 22 00:35:14 crc kubenswrapper[5116]: I0322 00:35:14.090028 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" Mar 22 00:35:18 crc kubenswrapper[5116]: I0322 00:35:18.929503 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 22 00:35:18 crc kubenswrapper[5116]: I0322 00:35:18.973786 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Mar 22 00:35:19 crc kubenswrapper[5116]: I0322 00:35:19.030911 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 22 00:35:19 crc kubenswrapper[5116]: I0322 00:35:19.736104 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:35:19 crc kubenswrapper[5116]: E0322 00:35:19.737066 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.004092 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6"] Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.005720 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"} Mar 22 00:35:20 crc kubenswrapper[5116]: W0322 00:35:20.035974 5116 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod874ff775_e791_4357_9719_a773b3a8e4d8.slice/crio-7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18 WatchSource:0}: Error finding container 7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18: Status 404 returned error can't find the container with id 7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18 Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.931542 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.933318 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:20 crc kubenswrapper[5116]: I0322 00:35:20.994053 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.016025 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"6a4e1e85-870e-408e-a4dd-c5a7d7fcecae","Type":"ContainerStarted","Data":"db2de75efae1f99672ef051de2d624c33479b2410198cea11e989e235ebdd662"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.018791 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"45969a2287bcba0858b98439f73dd9672c718889ae381f10574f35d1bdcd5701"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.022463 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" 
event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"34a142d14fdb70f283397702f4a0e3569bc2c7c63099ed71bd5d66c03fb31242"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.022507 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.022520 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"7cb09084032966ba6ef0160d72dd6b6cb95dc418dbf4acc1daa8c2f64ea51b18"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.025107 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"be22b1c1d5c7c321fa354ab59ee442ff30a573bafc539073c1dc06c846b02fa3"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.025147 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.028343 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"83271e63a11c7874f6bc12c5e162895af826ebec23720f3e7be635d16973683b"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.032064 5116 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"ffb5e995226a94702f1064e670a21e0075a72a8bc8c07e3cbf4f6f0329faf269"} Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.054228 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=16.285166252 podStartE2EDuration="37.054195896s" podCreationTimestamp="2026-03-22 00:34:44 +0000 UTC" firstStartedPulling="2026-03-22 00:34:58.957574602 +0000 UTC m=+1569.979875975" lastFinishedPulling="2026-03-22 00:35:19.726604246 +0000 UTC m=+1590.748905619" observedRunningTime="2026-03-22 00:35:21.044379769 +0000 UTC m=+1592.066681162" watchObservedRunningTime="2026-03-22 00:35:21.054195896 +0000 UTC m=+1592.076497309" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.082097 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.083257 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" podStartSLOduration=2.832020778 podStartE2EDuration="9.083218475s" podCreationTimestamp="2026-03-22 00:35:12 +0000 UTC" firstStartedPulling="2026-03-22 00:35:13.916429729 +0000 UTC m=+1584.938731102" lastFinishedPulling="2026-03-22 00:35:20.167627436 +0000 UTC m=+1591.189928799" observedRunningTime="2026-03-22 00:35:21.078753495 +0000 UTC m=+1592.101054888" watchObservedRunningTime="2026-03-22 00:35:21.083218475 +0000 UTC m=+1592.105519858" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.098117 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" podStartSLOduration=4.619487843 
podStartE2EDuration="17.098095611s" podCreationTimestamp="2026-03-22 00:35:04 +0000 UTC" firstStartedPulling="2026-03-22 00:35:07.614314246 +0000 UTC m=+1578.636615619" lastFinishedPulling="2026-03-22 00:35:20.092922014 +0000 UTC m=+1591.115223387" observedRunningTime="2026-03-22 00:35:21.095268112 +0000 UTC m=+1592.117569495" watchObservedRunningTime="2026-03-22 00:35:21.098095611 +0000 UTC m=+1592.120396994" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.150130 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" podStartSLOduration=8.911133321 podStartE2EDuration="23.150104761s" podCreationTimestamp="2026-03-22 00:34:58 +0000 UTC" firstStartedPulling="2026-03-22 00:35:05.496958359 +0000 UTC m=+1576.519259732" lastFinishedPulling="2026-03-22 00:35:19.735929799 +0000 UTC m=+1590.758231172" observedRunningTime="2026-03-22 00:35:21.119780311 +0000 UTC m=+1592.142081724" watchObservedRunningTime="2026-03-22 00:35:21.150104761 +0000 UTC m=+1592.172406134" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.158091 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" podStartSLOduration=7.8463985659999995 podStartE2EDuration="8.158068521s" podCreationTimestamp="2026-03-22 00:35:13 +0000 UTC" firstStartedPulling="2026-03-22 00:35:20.041402301 +0000 UTC m=+1591.063703674" lastFinishedPulling="2026-03-22 00:35:20.353072256 +0000 UTC m=+1591.375373629" observedRunningTime="2026-03-22 00:35:21.137355051 +0000 UTC m=+1592.159656424" watchObservedRunningTime="2026-03-22 00:35:21.158068521 +0000 UTC m=+1592.180369894" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.181629 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" podStartSLOduration=5.833926723 
podStartE2EDuration="20.181452184s" podCreationTimestamp="2026-03-22 00:35:01 +0000 UTC" firstStartedPulling="2026-03-22 00:35:05.49793427 +0000 UTC m=+1576.520235643" lastFinishedPulling="2026-03-22 00:35:19.845459731 +0000 UTC m=+1590.867761104" observedRunningTime="2026-03-22 00:35:21.157297106 +0000 UTC m=+1592.179598479" watchObservedRunningTime="2026-03-22 00:35:21.181452184 +0000 UTC m=+1592.203753557" Mar 22 00:35:21 crc kubenswrapper[5116]: I0322 00:35:21.227895 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.045997 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xvfm2" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server" containerID="cri-o://4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" gracePeriod=2 Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.434811 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.524643 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") pod \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.524690 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") pod \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.524761 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") pod \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\" (UID: \"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef\") " Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.525670 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities" (OuterVolumeSpecName: "utilities") pod "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" (UID: "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.532900 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4" (OuterVolumeSpecName: "kube-api-access-fsvs4") pod "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" (UID: "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef"). InnerVolumeSpecName "kube-api-access-fsvs4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.573634 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" (UID: "42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.626121 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fsvs4\" (UniqueName: \"kubernetes.io/projected/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-kube-api-access-fsvs4\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.626177 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:23 crc kubenswrapper[5116]: I0322 00:35:23.626187 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.053936 5116 generic.go:358] "Generic (PLEG): container finished" podID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" exitCode=0 Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054042 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xvfm2" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054079 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"} Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054132 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xvfm2" event={"ID":"42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef","Type":"ContainerDied","Data":"7100f58fc42898dd88286865a83e861cb4a7d901326e4759acb51b3639cfe743"} Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.054156 5116 scope.go:117] "RemoveContainer" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.077209 5116 scope.go:117] "RemoveContainer" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.088331 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.100896 5116 scope.go:117] "RemoveContainer" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.114048 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xvfm2"] Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.130747 5116 scope.go:117] "RemoveContainer" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" Mar 22 00:35:24 crc kubenswrapper[5116]: E0322 00:35:24.131123 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d\": container with ID starting with 4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d not found: ID does not exist" containerID="4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131181 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d"} err="failed to get container status \"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d\": rpc error: code = NotFound desc = could not find container \"4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d\": container with ID starting with 4700454666bd0a629b3d6fec2cc8d00c505815de21b47a5afdec2e2db73c721d not found: ID does not exist" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131208 5116 scope.go:117] "RemoveContainer" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73" Mar 22 00:35:24 crc kubenswrapper[5116]: E0322 00:35:24.131450 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73\": container with ID starting with c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73 not found: ID does not exist" containerID="c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131467 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73"} err="failed to get container status \"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73\": rpc error: code = NotFound desc = could not find container \"c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73\": container with ID 
starting with c8454fe022e8d1847a832e3350b9f001ecee9cb0a1e9553e14a376192c1d2a73 not found: ID does not exist" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131479 5116 scope.go:117] "RemoveContainer" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9" Mar 22 00:35:24 crc kubenswrapper[5116]: E0322 00:35:24.131630 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9\": container with ID starting with dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9 not found: ID does not exist" containerID="dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9" Mar 22 00:35:24 crc kubenswrapper[5116]: I0322 00:35:24.131643 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9"} err="failed to get container status \"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9\": rpc error: code = NotFound desc = could not find container \"dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9\": container with ID starting with dd4d6a7f7183a7d7bf2df9dd3a56d239ecc27e8ad7a7326e646d8f570dd97db9 not found: ID does not exist" Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.064261 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"] Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.064685 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect" containerID="cri-o://13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" gracePeriod=30 Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.439583 5116 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.467100 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-9ppbp"]
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469003 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469050 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469080 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469088 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469100 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-utilities"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469106 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-utilities"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469142 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-content"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469147 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="extract-content"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469269 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" containerName="registry-server"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.469282 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerName="default-interconnect"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.473735 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.484120 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-9ppbp"]
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.554809 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.554981 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555076 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555102 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555131 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555191 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555348 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") pod \"e29f610d-8ef1-4992-857d-7b39f8694e44\" (UID: \"e29f610d-8ef1-4992-857d-7b39f8694e44\") "
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555469 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555517 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555580 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5v4m\" (UniqueName: \"kubernetes.io/projected/8d54082f-1922-421a-85c0-77b5d30d7e68-kube-api-access-t5v4m\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555608 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555665 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-config\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555720 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.555804 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-users\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.558196 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.561473 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.562307 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p" (OuterVolumeSpecName: "kube-api-access-ssw5p") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "kube-api-access-ssw5p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.569310 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.570140 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.571301 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.578347 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "e29f610d-8ef1-4992-857d-7b39f8694e44" (UID: "e29f610d-8ef1-4992-857d-7b39f8694e44"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657327 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5v4m\" (UniqueName: \"kubernetes.io/projected/8d54082f-1922-421a-85c0-77b5d30d7e68-kube-api-access-t5v4m\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657369 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657397 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-config\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.657988 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658225 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-users\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658314 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658381 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658491 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658513 5116 reconciler_common.go:299] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-config\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658529 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658543 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658558 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ssw5p\" (UniqueName: \"kubernetes.io/projected/e29f610d-8ef1-4992-857d-7b39f8694e44-kube-api-access-ssw5p\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658572 5116 reconciler_common.go:299] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.658585 5116 reconciler_common.go:299] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/e29f610d-8ef1-4992-857d-7b39f8694e44-sasl-users\") on node \"crc\" DevicePath \"\""
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.660643 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.660983 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-config\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.662674 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-inter-router-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.662895 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-credentials\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.663969 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-sasl-users\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.666084 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/8d54082f-1922-421a-85c0-77b5d30d7e68-default-interconnect-openstack-ca\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.676995 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5v4m\" (UniqueName: \"kubernetes.io/projected/8d54082f-1922-421a-85c0-77b5d30d7e68-kube-api-access-t5v4m\") pod \"default-interconnect-55bf8d5cb-9ppbp\" (UID: \"8d54082f-1922-421a-85c0-77b5d30d7e68\") " pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.709285 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef" path="/var/lib/kubelet/pods/42719985-4c2c-4ea4-bdc8-7ab90e6aa9ef/volumes"
Mar 22 00:35:25 crc kubenswrapper[5116]: I0322 00:35:25.788549 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp"
Mar 22 00:35:26 crc kubenswrapper[5116]: E0322 00:35:26.062975 5116 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a84a49_cbcf_41dd_8ed2_1df6cd7db259.slice/crio-conmon-75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e2f0d9c_7207_4164_a27f_9efeba6e22bb.slice/crio-conmon-3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e.scope\": RecentStats: unable to find data in memory cache]"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.084875 5116 generic.go:358] "Generic (PLEG): container finished" podID="8e2f0d9c-7207-4164-a27f-9efeba6e22bb" containerID="3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e" exitCode=0
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.085022 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerDied","Data":"3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e"}
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.085687 5116 scope.go:117] "RemoveContainer" containerID="3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.091891 5116 generic.go:358] "Generic (PLEG): container finished" podID="e29f610d-8ef1-4992-857d-7b39f8694e44" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e" exitCode=0
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.091947 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.092254 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerDied","Data":"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"}
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.092292 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-r7j5z" event={"ID":"e29f610d-8ef1-4992-857d-7b39f8694e44","Type":"ContainerDied","Data":"bf3991c3ece1f301b0d1fccfff351a4b6d180bae7f391fffa01dbd097f35a02d"}
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.092322 5116 scope.go:117] "RemoveContainer" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.102381 5116 generic.go:358] "Generic (PLEG): container finished" podID="c2a84a49-cbcf-41dd-8ed2-1df6cd7db259" containerID="75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042" exitCode=0
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.102487 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerDied","Data":"75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"}
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.110054 5116 scope.go:117] "RemoveContainer" containerID="75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.116959 5116 generic.go:358] "Generic (PLEG): container finished" podID="874ff775-e791-4357-9719-a773b3a8e4d8" containerID="bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618" exitCode=0
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.117044 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerDied","Data":"bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"}
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.117516 5116 scope.go:117] "RemoveContainer" containerID="bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.138179 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"]
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.146420 5116 generic.go:358] "Generic (PLEG): container finished" podID="f826e257-238f-4715-8010-f30569577292" containerID="cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a" exitCode=0
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.146495 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerDied","Data":"cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"}
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.147065 5116 scope.go:117] "RemoveContainer" containerID="cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.148868 5116 scope.go:117] "RemoveContainer" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.148991 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-r7j5z"]
Mar 22 00:35:26 crc kubenswrapper[5116]: E0322 00:35:26.154255 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e\": container with ID starting with 13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e not found: ID does not exist" containerID="13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.154305 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e"} err="failed to get container status \"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e\": rpc error: code = NotFound desc = could not find container \"13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e\": container with ID starting with 13efdb7b09dbacea618ae6e5e926f99c6af7e489c0aef05534fdc0f2a748af1e not found: ID does not exist"
Mar 22 00:35:26 crc kubenswrapper[5116]: I0322 00:35:26.260381 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-55bf8d5cb-9ppbp"]
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.156188 5116 generic.go:358] "Generic (PLEG): container finished" podID="1d6d17f0-5973-421e-9838-1a6195ca1731" containerID="f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440" exitCode=0
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.156305 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerDied","Data":"f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.157449 5116 scope.go:117] "RemoveContainer" containerID="f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.161482 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.165994 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" event={"ID":"8d54082f-1922-421a-85c0-77b5d30d7e68","Type":"ContainerStarted","Data":"d57ef9bb49b3b11aaedb1cccd0272d3181c5da638914769f6096dfdd7d2ef4c4"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.166277 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" event={"ID":"8d54082f-1922-421a-85c0-77b5d30d7e68","Type":"ContainerStarted","Data":"cb26f4a368af69a8b7a008081c9816094bb65715cc0ab7a96bac4b42191a080a"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.170584 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.179674 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.182456 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"}
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.272185 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-55bf8d5cb-9ppbp" podStartSLOduration=2.272151151 podStartE2EDuration="2.272151151s" podCreationTimestamp="2026-03-22 00:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-22 00:35:27.261613852 +0000 UTC m=+1598.283915245" watchObservedRunningTime="2026-03-22 00:35:27.272151151 +0000 UTC m=+1598.294452524"
Mar 22 00:35:27 crc kubenswrapper[5116]: I0322 00:35:27.704584 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29f610d-8ef1-4992-857d-7b39f8694e44" path="/var/lib/kubelet/pods/e29f610d-8ef1-4992-857d-7b39f8694e44/volumes"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.198772 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209190 5116 generic.go:358] "Generic (PLEG): container finished" podID="8e2f0d9c-7207-4164-a27f-9efeba6e22bb" containerID="b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209419 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerDied","Data":"b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209455 5116 scope.go:117] "RemoveContainer" containerID="3a5ae25ea0289caa78d46760494110691c8efbbb0519cede3b2af027cdfa9f7e"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.209925 5116 scope.go:117] "RemoveContainer" containerID="b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50"
Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.210161 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l_service-telemetry(8e2f0d9c-7207-4164-a27f-9efeba6e22bb)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" podUID="8e2f0d9c-7207-4164-a27f-9efeba6e22bb"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.214781 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerDied","Data":"3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.215883 5116 scope.go:117] "RemoveContainer" containerID="3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959"
Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.216370 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q_service-telemetry(c2a84a49-cbcf-41dd-8ed2-1df6cd7db259)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" podUID="c2a84a49-cbcf-41dd-8ed2-1df6cd7db259"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.212468 5116 generic.go:358] "Generic (PLEG): container finished" podID="c2a84a49-cbcf-41dd-8ed2-1df6cd7db259" containerID="3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.261050 5116 generic.go:358] "Generic (PLEG): container finished" podID="874ff775-e791-4357-9719-a773b3a8e4d8" containerID="080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.261190 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerDied","Data":"080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.261459 5116 scope.go:117] "RemoveContainer" containerID="75b36066b06077650c7f84b0f9fd6c64cfd5d0637d4554891130385df8f38042"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.262162 5116 scope.go:117] "RemoveContainer" containerID="080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c"
Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.262471 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6_service-telemetry(874ff775-e791-4357-9719-a773b3a8e4d8)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" podUID="874ff775-e791-4357-9719-a773b3a8e4d8"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.276188 5116 generic.go:358] "Generic (PLEG): container finished" podID="f826e257-238f-4715-8010-f30569577292" containerID="c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd" exitCode=0
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.276991 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerDied","Data":"c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"}
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.277352 5116 scope.go:117] "RemoveContainer" containerID="c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd"
Mar 22 00:35:28 crc kubenswrapper[5116]: E0322 00:35:28.277596 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-664479c99b-v2bfp_service-telemetry(f826e257-238f-4715-8010-f30569577292)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" podUID="f826e257-238f-4715-8010-f30569577292"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.296516 5116 scope.go:117] "RemoveContainer" containerID="bdc01262a2be3743c259f8f0138badb55dc7faf715baa5327a31fb3866566618"
Mar 22 00:35:28 crc kubenswrapper[5116]: I0322 00:35:28.343379 5116 scope.go:117] "RemoveContainer" containerID="cd977ceb1ee8ac9e9d8cfb561512ee335af388b7dd19acb55ddaec492281722a"
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.292886 5116 generic.go:358] "Generic (PLEG): container finished" podID="1d6d17f0-5973-421e-9838-1a6195ca1731" containerID="e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366" exitCode=0
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.293021 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerDied","Data":"e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"}
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.293147 5116 scope.go:117] "RemoveContainer" containerID="f1ed95bed5bc469f431a016c5abaeb431a702448728bbeb1b68788f3fcb5b440"
Mar 22 00:35:29 crc kubenswrapper[5116]: I0322 00:35:29.293677 5116 scope.go:117] "RemoveContainer" containerID="e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366"
Mar 22 00:35:29 crc kubenswrapper[5116]: E0322 00:35:29.294061 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5_service-telemetry(1d6d17f0-5973-421e-9838-1a6195ca1731)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" podUID="1d6d17f0-5973-421e-9838-1a6195ca1731"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.017320 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"]
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.027597 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.036795 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"service-telemetry\"/\"default-interconnect-selfsigned\""
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.037014 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"qdr-test-config\""
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.044000 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"]
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.162430 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/90868bad-f18e-4266-a1cf-e633cccb0c4b-qdr-test-config\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.162755 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q79cf\" (UniqueName: \"kubernetes.io/projected/90868bad-f18e-4266-a1cf-e633cccb0c4b-kube-api-access-q79cf\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.162798 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/90868bad-f18e-4266-a1cf-e633cccb0c4b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test"
Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.264138 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q79cf\" (UniqueName:
\"kubernetes.io/projected/90868bad-f18e-4266-a1cf-e633cccb0c4b-kube-api-access-q79cf\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.264232 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/90868bad-f18e-4266-a1cf-e633cccb0c4b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.264306 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/90868bad-f18e-4266-a1cf-e633cccb0c4b-qdr-test-config\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.265095 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/90868bad-f18e-4266-a1cf-e633cccb0c4b-qdr-test-config\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.271928 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/90868bad-f18e-4266-a1cf-e633cccb0c4b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.279307 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q79cf\" (UniqueName: \"kubernetes.io/projected/90868bad-f18e-4266-a1cf-e633cccb0c4b-kube-api-access-q79cf\") pod \"qdr-test\" (UID: 
\"90868bad-f18e-4266-a1cf-e633cccb0c4b\") " pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.351175 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 22 00:35:31 crc kubenswrapper[5116]: I0322 00:35:31.562599 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 22 00:35:31 crc kubenswrapper[5116]: W0322 00:35:31.572080 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90868bad_f18e_4266_a1cf_e633cccb0c4b.slice/crio-4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491 WatchSource:0}: Error finding container 4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491: Status 404 returned error can't find the container with id 4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491 Mar 22 00:35:32 crc kubenswrapper[5116]: I0322 00:35:32.319730 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"90868bad-f18e-4266-a1cf-e633cccb0c4b","Type":"ContainerStarted","Data":"4308b575ec5b0569bef535090541e86d108cbd93df57b8268c7b07c6239f6491"} Mar 22 00:35:33 crc kubenswrapper[5116]: I0322 00:35:33.697036 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:35:33 crc kubenswrapper[5116]: E0322 00:35:33.697534 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.377702 5116 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"90868bad-f18e-4266-a1cf-e633cccb0c4b","Type":"ContainerStarted","Data":"617a091a99cd55ca5a220878be801e2fb93ede016235ce0e811d04e68be20306"} Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.401765 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.13611172 podStartE2EDuration="8.401742714s" podCreationTimestamp="2026-03-22 00:35:31 +0000 UTC" firstStartedPulling="2026-03-22 00:35:31.575572407 +0000 UTC m=+1602.597873780" lastFinishedPulling="2026-03-22 00:35:38.841203401 +0000 UTC m=+1609.863504774" observedRunningTime="2026-03-22 00:35:39.39456761 +0000 UTC m=+1610.416869003" watchObservedRunningTime="2026-03-22 00:35:39.401742714 +0000 UTC m=+1610.424044097" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.712856 5116 scope.go:117] "RemoveContainer" containerID="c2a3eac07cc2d9a8898b3dd56f24b336dc8fca4c9810a0f5ec2fb846869f71cd" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.713217 5116 scope.go:117] "RemoveContainer" containerID="b189b9b524a3cf072ed63771382222efd8f4226bb25c64118c701ad857050c50" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.713684 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-fq6th"] Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.713724 5116 scope.go:117] "RemoveContainer" containerID="3125654e52f6752f2135b44ffeec172c9551d383e21ca203d1f3b90f217ad959" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.737257 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-fq6th"] Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.737538 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740688 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-ceilometer-entrypoint-script\"" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740930 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-ceilometer-publisher\"" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740718 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-collectd-entrypoint-script\"" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740753 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-sensubility-config\"" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.740688 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-healthcheck-log\"" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.741632 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"service-telemetry\"/\"stf-smoketest-collectd-config\"" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892410 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892761 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb52l\" (UniqueName: 
\"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892802 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892834 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892871 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.892954 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 
00:35:39.893016 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.994884 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.994940 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.994976 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995017 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " 
pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995126 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995460 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.995564 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.996419 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.996532 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " 
pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.996997 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.997444 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:39 crc kubenswrapper[5116]: I0322 00:35:39.998656 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.000021 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.022924 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"stf-smoketest-smoke1-fq6th\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") 
" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.068779 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.208026 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.212226 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.214785 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.304489 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"curl\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " pod="service-telemetry/curl" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.388061 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q" event={"ID":"c2a84a49-cbcf-41dd-8ed2-1df6cd7db259","Type":"ContainerStarted","Data":"cc828f581887fc1fadfbecd7b268fbbeb23585aeb83436fe041c341f09db5496"} Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.392662 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-664479c99b-v2bfp" event={"ID":"f826e257-238f-4715-8010-f30569577292","Type":"ContainerStarted","Data":"ade617f3251dcbda4dad6917f454d25bd36f6f4c3c96db162d9b2dd2dc267aa5"} Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.395194 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l" 
event={"ID":"8e2f0d9c-7207-4164-a27f-9efeba6e22bb","Type":"ContainerStarted","Data":"4681bd487cb8451d044892f561b12bb15a002abb5bdee0edd0e114e0905781af"} Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.407396 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"curl\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " pod="service-telemetry/curl" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.440582 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"curl\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " pod="service-telemetry/curl" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.541784 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.623444 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-fq6th"] Mar 22 00:35:40 crc kubenswrapper[5116]: I0322 00:35:40.990486 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.404510 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerStarted","Data":"aceab4cbda2cdab69ea549cea0457aaa095384a3a7237572e46e8cc591a43c67"} Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.406215 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"08818159-a850-4f98-8d52-3685351c96e4","Type":"ContainerStarted","Data":"e583baa57c9a16bc4d1316f91cda6f5dca1ec0de1b504d701235d1a37f93667f"} Mar 22 00:35:41 crc 
kubenswrapper[5116]: I0322 00:35:41.697880 5116 scope.go:117] "RemoveContainer" containerID="e7939b59ddf522ad2c8accf1d15d7c82f6217cc6ee65b4d9abe20e1ec7f9f366" Mar 22 00:35:41 crc kubenswrapper[5116]: I0322 00:35:41.697978 5116 scope.go:117] "RemoveContainer" containerID="080b0859ad06c007a338b7db7695f50a6d107982b0b16937055ece2ed69d5f4c" Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.425127 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6" event={"ID":"874ff775-e791-4357-9719-a773b3a8e4d8","Type":"ContainerStarted","Data":"410b6262c50815a5ed33819dc3bec1eaa9ed41ec7966e6426b7941c43d7c633e"} Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.431444 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5" event={"ID":"1d6d17f0-5973-421e-9838-1a6195ca1731","Type":"ContainerStarted","Data":"674c7c53f050eae528d2d61757a061f140572ded3bc868f29fd51fd161ab8cbe"} Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.435401 5116 generic.go:358] "Generic (PLEG): container finished" podID="08818159-a850-4f98-8d52-3685351c96e4" containerID="5a89a5faf89fe511861b949dea89be2304c3b4ec8557f73e3c852a8b6c3274c7" exitCode=0 Mar 22 00:35:43 crc kubenswrapper[5116]: I0322 00:35:43.435578 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"08818159-a850-4f98-8d52-3685351c96e4","Type":"ContainerDied","Data":"5a89a5faf89fe511861b949dea89be2304c3b4ec8557f73e3c852a8b6c3274c7"} Mar 22 00:35:47 crc kubenswrapper[5116]: I0322 00:35:47.699065 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:35:47 crc kubenswrapper[5116]: E0322 00:35:47.700233 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.654348 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.825653 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_08818159-a850-4f98-8d52-3685351c96e4/curl/0.log" Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.842936 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") pod \"08818159-a850-4f98-8d52-3685351c96e4\" (UID: \"08818159-a850-4f98-8d52-3685351c96e4\") " Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.853994 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg" (OuterVolumeSpecName: "kube-api-access-ng9pg") pod "08818159-a850-4f98-8d52-3685351c96e4" (UID: "08818159-a850-4f98-8d52-3685351c96e4"). InnerVolumeSpecName "kube-api-access-ng9pg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:35:48 crc kubenswrapper[5116]: I0322 00:35:48.944657 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ng9pg\" (UniqueName: \"kubernetes.io/projected/08818159-a850-4f98-8d52-3685351c96e4-kube-api-access-ng9pg\") on node \"crc\" DevicePath \"\"" Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.091091 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-694dc457d5-77wwx_536dbd5d-337e-43fa-925a-e88d3be7da06/prometheus-webhook-snmp/0.log" Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.487950 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"08818159-a850-4f98-8d52-3685351c96e4","Type":"ContainerDied","Data":"e583baa57c9a16bc4d1316f91cda6f5dca1ec0de1b504d701235d1a37f93667f"} Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.487994 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e583baa57c9a16bc4d1316f91cda6f5dca1ec0de1b504d701235d1a37f93667f" Mar 22 00:35:49 crc kubenswrapper[5116]: I0322 00:35:49.488056 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 22 00:35:53 crc kubenswrapper[5116]: I0322 00:35:53.521689 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerStarted","Data":"e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3"} Mar 22 00:35:58 crc kubenswrapper[5116]: I0322 00:35:58.561273 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerStarted","Data":"2481a3fbab199fb083c39362d71c9c80752d96cb67c6ae685e405b92d8ea560d"} Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.144725 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-fq6th" podStartSLOduration=3.9207053800000002 podStartE2EDuration="21.144697112s" podCreationTimestamp="2026-03-22 00:35:39 +0000 UTC" firstStartedPulling="2026-03-22 00:35:40.646479028 +0000 UTC m=+1611.668780401" lastFinishedPulling="2026-03-22 00:35:57.87047076 +0000 UTC m=+1628.892772133" observedRunningTime="2026-03-22 00:35:58.586451716 +0000 UTC m=+1629.608753109" watchObservedRunningTime="2026-03-22 00:36:00.144697112 +0000 UTC m=+1631.166998525" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.154693 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"] Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.157609 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08818159-a850-4f98-8d52-3685351c96e4" containerName="curl" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.157859 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="08818159-a850-4f98-8d52-3685351c96e4" containerName="curl" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.158402 5116 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="08818159-a850-4f98-8d52-3685351c96e4" containerName="curl" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.166219 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"] Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.166440 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.169572 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.170031 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.170382 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.338154 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"auto-csr-approver-29568996-kb9ll\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.440129 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"auto-csr-approver-29568996-kb9ll\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.473227 5116 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"auto-csr-approver-29568996-kb9ll\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.500993 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:00 crc kubenswrapper[5116]: I0322 00:36:00.774271 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568996-kb9ll"] Mar 22 00:36:00 crc kubenswrapper[5116]: W0322 00:36:00.778377 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a2fcae9_bfc7_4f9f_8c49_bac76c3a652a.slice/crio-619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a WatchSource:0}: Error finding container 619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a: Status 404 returned error can't find the container with id 619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a Mar 22 00:36:01 crc kubenswrapper[5116]: I0322 00:36:01.591373 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" event={"ID":"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a","Type":"ContainerStarted","Data":"619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a"} Mar 22 00:36:02 crc kubenswrapper[5116]: I0322 00:36:02.604369 5116 generic.go:358] "Generic (PLEG): container finished" podID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerID="688a5bb9aa102b8d7fbc699af35c11b9bcfdddc98bc6964ddb5e59d34be83553" exitCode=0 Mar 22 00:36:02 crc kubenswrapper[5116]: I0322 00:36:02.604596 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" 
event={"ID":"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a","Type":"ContainerDied","Data":"688a5bb9aa102b8d7fbc699af35c11b9bcfdddc98bc6964ddb5e59d34be83553"} Mar 22 00:36:02 crc kubenswrapper[5116]: I0322 00:36:02.697677 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:02 crc kubenswrapper[5116]: E0322 00:36:02.698076 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:03 crc kubenswrapper[5116]: I0322 00:36:03.948935 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.009002 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") pod \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\" (UID: \"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a\") " Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.020527 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j" (OuterVolumeSpecName: "kube-api-access-5zj4j") pod "0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" (UID: "0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a"). InnerVolumeSpecName "kube-api-access-5zj4j". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.110918 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5zj4j\" (UniqueName: \"kubernetes.io/projected/0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a-kube-api-access-5zj4j\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.633663 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.633710 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568996-kb9ll" event={"ID":"0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a","Type":"ContainerDied","Data":"619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a"} Mar 22 00:36:04 crc kubenswrapper[5116]: I0322 00:36:04.633768 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="619d93ceb49d6f15b82900f39d8b1db1a0fc7f4d04996436b877c7d92977225a" Mar 22 00:36:05 crc kubenswrapper[5116]: I0322 00:36:05.041376 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:36:05 crc kubenswrapper[5116]: I0322 00:36:05.052124 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568990-vx25q"] Mar 22 00:36:05 crc kubenswrapper[5116]: I0322 00:36:05.714063 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753649c7-f19a-4b90-a29a-2108a691e934" path="/var/lib/kubelet/pods/753649c7-f19a-4b90-a29a-2108a691e934/volumes" Mar 22 00:36:16 crc kubenswrapper[5116]: I0322 00:36:16.697090 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:16 crc kubenswrapper[5116]: E0322 00:36:16.699451 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:19 crc kubenswrapper[5116]: I0322 00:36:19.403747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-694dc457d5-77wwx_536dbd5d-337e-43fa-925a-e88d3be7da06/prometheus-webhook-snmp/0.log" Mar 22 00:36:27 crc kubenswrapper[5116]: I0322 00:36:27.841596 5116 generic.go:358] "Generic (PLEG): container finished" podID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerID="e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3" exitCode=0 Mar 22 00:36:27 crc kubenswrapper[5116]: I0322 00:36:27.841658 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerDied","Data":"e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3"} Mar 22 00:36:27 crc kubenswrapper[5116]: I0322 00:36:27.842770 5116 scope.go:117] "RemoveContainer" containerID="e5cae1dfbc260e122eb67f28931ab2ad11120b5060aef39d786b1e78211663e3" Mar 22 00:36:29 crc kubenswrapper[5116]: I0322 00:36:29.860665 5116 generic.go:358] "Generic (PLEG): container finished" podID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerID="2481a3fbab199fb083c39362d71c9c80752d96cb67c6ae685e405b92d8ea560d" exitCode=0 Mar 22 00:36:29 crc kubenswrapper[5116]: I0322 00:36:29.860750 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerDied","Data":"2481a3fbab199fb083c39362d71c9c80752d96cb67c6ae685e405b92d8ea560d"} Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.153601 
5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236241 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236337 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236471 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236509 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236541 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc 
kubenswrapper[5116]: I0322 00:36:31.236637 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.236770 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") pod \"5256051f-7482-42a0-921d-0f3073e5f5fb\" (UID: \"5256051f-7482-42a0-921d-0f3073e5f5fb\") " Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.243628 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l" (OuterVolumeSpecName: "kube-api-access-vb52l") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "kube-api-access-vb52l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.257386 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "sensubility-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.258812 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.259210 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.260559 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.270308 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.271042 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "5256051f-7482-42a0-921d-0f3073e5f5fb" (UID: "5256051f-7482-42a0-921d-0f3073e5f5fb"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338162 5116 reconciler_common.go:299] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338238 5116 reconciler_common.go:299] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338252 5116 reconciler_common.go:299] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338267 5116 reconciler_common.go:299] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338281 5116 reconciler_common.go:299] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338294 5116 reconciler_common.go:299] "Volume 
detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/5256051f-7482-42a0-921d-0f3073e5f5fb-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.338306 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb52l\" (UniqueName: \"kubernetes.io/projected/5256051f-7482-42a0-921d-0f3073e5f5fb-kube-api-access-vb52l\") on node \"crc\" DevicePath \"\"" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.698450 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:31 crc kubenswrapper[5116]: E0322 00:36:31.698997 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.887038 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-fq6th" event={"ID":"5256051f-7482-42a0-921d-0f3073e5f5fb","Type":"ContainerDied","Data":"aceab4cbda2cdab69ea549cea0457aaa095384a3a7237572e46e8cc591a43c67"} Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.887090 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aceab4cbda2cdab69ea549cea0457aaa095384a3a7237572e46e8cc591a43c67" Mar 22 00:36:31 crc kubenswrapper[5116]: I0322 00:36:31.887150 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-fq6th" Mar 22 00:36:33 crc kubenswrapper[5116]: I0322 00:36:33.220096 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-fq6th_5256051f-7482-42a0-921d-0f3073e5f5fb/smoketest-collectd/0.log" Mar 22 00:36:33 crc kubenswrapper[5116]: I0322 00:36:33.493125 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-fq6th_5256051f-7482-42a0-921d-0f3073e5f5fb/smoketest-ceilometer/0.log" Mar 22 00:36:33 crc kubenswrapper[5116]: I0322 00:36:33.785998 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-55bf8d5cb-9ppbp_8d54082f-1922-421a-85c0-77b5d30d7e68/default-interconnect/0.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.043000 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l_8e2f0d9c-7207-4164-a27f-9efeba6e22bb/bridge/2.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.308120 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7f8f5c6486-wwk8l_8e2f0d9c-7207-4164-a27f-9efeba6e22bb/sg-core/0.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.597828 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-664479c99b-v2bfp_f826e257-238f-4715-8010-f30569577292/bridge/2.log" Mar 22 00:36:34 crc kubenswrapper[5116]: I0322 00:36:34.879671 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-664479c99b-v2bfp_f826e257-238f-4715-8010-f30569577292/sg-core/0.log" Mar 22 00:36:35 crc kubenswrapper[5116]: I0322 00:36:35.145434 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q_c2a84a49-cbcf-41dd-8ed2-1df6cd7db259/bridge/2.log" Mar 22 00:36:35 crc kubenswrapper[5116]: I0322 00:36:35.441540 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-c9f4bb7dc-wvb4q_c2a84a49-cbcf-41dd-8ed2-1df6cd7db259/sg-core/0.log" Mar 22 00:36:35 crc kubenswrapper[5116]: I0322 00:36:35.757870 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6_874ff775-e791-4357-9719-a773b3a8e4d8/bridge/2.log" Mar 22 00:36:36 crc kubenswrapper[5116]: I0322 00:36:36.063565 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-7595b679bb-lh2b6_874ff775-e791-4357-9719-a773b3a8e4d8/sg-core/0.log" Mar 22 00:36:36 crc kubenswrapper[5116]: I0322 00:36:36.354544 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5_1d6d17f0-5973-421e-9838-1a6195ca1731/bridge/2.log" Mar 22 00:36:36 crc kubenswrapper[5116]: I0322 00:36:36.666782 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-58c78bbf69-cbmv5_1d6d17f0-5973-421e-9838-1a6195ca1731/sg-core/0.log" Mar 22 00:36:39 crc kubenswrapper[5116]: I0322 00:36:39.949456 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-564975b589-lf9hg_5f8e2cff-6459-4aae-8cf6-a48587311f68/operator/0.log" Mar 22 00:36:40 crc kubenswrapper[5116]: I0322 00:36:40.248407 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_bd2bf44b-01e8-4236-9dda-90998dd75b88/prometheus/0.log" Mar 22 00:36:40 crc kubenswrapper[5116]: I0322 00:36:40.531882 5116 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_ccb103ab-2a74-44b8-b853-0da2e0b4a6b5/elasticsearch/0.log" Mar 22 00:36:40 crc kubenswrapper[5116]: I0322 00:36:40.859241 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-694dc457d5-77wwx_536dbd5d-337e-43fa-925a-e88d3be7da06/prometheus-webhook-snmp/0.log" Mar 22 00:36:41 crc kubenswrapper[5116]: I0322 00:36:41.205432 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_6a4e1e85-870e-408e-a4dd-c5a7d7fcecae/alertmanager/0.log" Mar 22 00:36:44 crc kubenswrapper[5116]: I0322 00:36:44.697533 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:44 crc kubenswrapper[5116]: E0322 00:36:44.699232 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:56 crc kubenswrapper[5116]: I0322 00:36:56.698424 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:36:56 crc kubenswrapper[5116]: E0322 00:36:56.699396 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:36:57 crc kubenswrapper[5116]: I0322 
00:36:57.320767 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-55b77595c-67kjz_fcfe47c3-284e-4c8b-b35f-06a0499313a5/operator/0.log" Mar 22 00:36:57 crc kubenswrapper[5116]: I0322 00:36:57.796248 5116 scope.go:117] "RemoveContainer" containerID="85bd7d245d43142b219c3be5ac468eafb8ba3e6e6a34155393343c620d8b140b" Mar 22 00:37:00 crc kubenswrapper[5116]: I0322 00:37:00.902930 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-564975b589-lf9hg_5f8e2cff-6459-4aae-8cf6-a48587311f68/operator/0.log" Mar 22 00:37:01 crc kubenswrapper[5116]: I0322 00:37:01.200767 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_90868bad-f18e-4266-a1cf-e633cccb0c4b/qdr/0.log" Mar 22 00:37:11 crc kubenswrapper[5116]: I0322 00:37:11.697317 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:37:11 crc kubenswrapper[5116]: E0322 00:37:11.698484 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:37:23 crc kubenswrapper[5116]: I0322 00:37:23.697840 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:37:23 crc kubenswrapper[5116]: E0322 00:37:23.698869 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.200533 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202005 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerName="oc" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202036 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerName="oc" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202054 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-collectd" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202064 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-collectd" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202142 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-ceilometer" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202157 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-ceilometer" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202370 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a2fcae9-bfc7-4f9f-8c49-bac76c3a652a" containerName="oc" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202409 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" 
containerName="smoketest-collectd" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.202427 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="5256051f-7482-42a0-921d-0f3073e5f5fb" containerName="smoketest-ceilometer" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.208406 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.210223 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mmn2b\"/\"openshift-service-ca.crt\"" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.211218 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mmn2b\"/\"kube-root-ca.crt\"" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.216016 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.353495 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.353799 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.455705 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.455799 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.456204 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.478455 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"must-gather-pxgv6\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.540635 5116 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6"
Mar 22 00:37:26 crc kubenswrapper[5116]: I0322 00:37:26.864053 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"]
Mar 22 00:37:27 crc kubenswrapper[5116]: I0322 00:37:27.521762 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerStarted","Data":"046ccfde5254025be74845f8d01e12de4bae01aa5036280050a66768e75af1f8"}
Mar 22 00:37:33 crc kubenswrapper[5116]: I0322 00:37:33.576571 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerStarted","Data":"c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e"}
Mar 22 00:37:33 crc kubenswrapper[5116]: I0322 00:37:33.577320 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerStarted","Data":"d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864"}
Mar 22 00:37:33 crc kubenswrapper[5116]: I0322 00:37:33.595494 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" podStartSLOduration=1.621485078 podStartE2EDuration="7.595479431s" podCreationTimestamp="2026-03-22 00:37:26 +0000 UTC" firstStartedPulling="2026-03-22 00:37:26.875948609 +0000 UTC m=+1717.898249982" lastFinishedPulling="2026-03-22 00:37:32.849942972 +0000 UTC m=+1723.872244335" observedRunningTime="2026-03-22 00:37:33.594636125 +0000 UTC m=+1724.616937498" watchObservedRunningTime="2026-03-22 00:37:33.595479431 +0000 UTC m=+1724.617780804"
Mar 22 00:37:34 crc kubenswrapper[5116]: I0322 00:37:34.697302 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:37:34 crc kubenswrapper[5116]: E0322 00:37:34.697840 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:37:47 crc kubenswrapper[5116]: I0322 00:37:47.697057 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:37:47 crc kubenswrapper[5116]: E0322 00:37:47.697935 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:37:57 crc kubenswrapper[5116]: I0322 00:37:57.932713 5116 scope.go:117] "RemoveContainer" containerID="9efa61524ace63b51ff35c7e96502efaaf7a080de1173648c97977985912b667"
Mar 22 00:37:58 crc kubenswrapper[5116]: I0322 00:37:58.008256 5116 scope.go:117] "RemoveContainer" containerID="4106dafe916c442ff6a6c84ef7c764001d5f6257e8b17fde82563cb3dcaa24f7"
Mar 22 00:37:58 crc kubenswrapper[5116]: I0322 00:37:58.084711 5116 scope.go:117] "RemoveContainer" containerID="095ee6b2ca6fc427f5a465f41b63b9951a947f316d689c28f653b52df20fd554"
Mar 22 00:37:58 crc kubenswrapper[5116]: I0322 00:37:58.174536 5116 scope.go:117] "RemoveContainer" containerID="2a6933fce9cc74d356d1d5a231d547fdb2259fd5ca76515682cf2f100750a7ff"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.132786 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"]
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.140457 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.149297 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"]
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.224390 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29568998-4b5vm"]
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.229027 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.232052 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\""
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.232117 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\""
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.232409 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\""
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.236002 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568998-4b5vm"]
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.293329 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"auto-csr-approver-29568998-4b5vm\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") " pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.293436 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"infrawatch-operators-4fdg5\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.394453 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"infrawatch-operators-4fdg5\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.394614 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"auto-csr-approver-29568998-4b5vm\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") " pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.420904 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"infrawatch-operators-4fdg5\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") " pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.422140 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"auto-csr-approver-29568998-4b5vm\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") " pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.477788 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.566136 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:00 crc kubenswrapper[5116]: I0322 00:38:00.898775 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"]
Mar 22 00:38:00 crc kubenswrapper[5116]: W0322 00:38:00.901032 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb75cdb_9b93_44dd_9523_50b2c6f734d2.slice/crio-9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1 WatchSource:0}: Error finding container 9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1: Status 404 returned error can't find the container with id 9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1
Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.003520 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29568998-4b5vm"]
Mar 22 00:38:01 crc kubenswrapper[5116]: W0322 00:38:01.008719 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7926ef86_b333_4a8b_98ca_7a71240e7702.slice/crio-a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871 WatchSource:0}: Error finding container a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871: Status 404 returned error can't find the container with id a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871
Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.697309 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:38:01 crc kubenswrapper[5116]: E0322 00:38:01.697959 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.825247 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" event={"ID":"7926ef86-b333-4a8b-98ca-7a71240e7702","Type":"ContainerStarted","Data":"a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871"}
Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.826903 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerStarted","Data":"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"}
Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.826968 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerStarted","Data":"9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1"}
Mar 22 00:38:01 crc kubenswrapper[5116]: I0322 00:38:01.847692 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-4fdg5" podStartSLOduration=1.681940601 podStartE2EDuration="1.84766172s" podCreationTimestamp="2026-03-22 00:38:00 +0000 UTC" firstStartedPulling="2026-03-22 00:38:00.90246418 +0000 UTC m=+1751.924765553" lastFinishedPulling="2026-03-22 00:38:01.068185299 +0000 UTC m=+1752.090486672" observedRunningTime="2026-03-22 00:38:01.840067417 +0000 UTC m=+1752.862368830" watchObservedRunningTime="2026-03-22 00:38:01.84766172 +0000 UTC m=+1752.869963133"
Mar 22 00:38:02 crc kubenswrapper[5116]: I0322 00:38:02.837146 5116 generic.go:358] "Generic (PLEG): container finished" podID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerID="f1799b324b05b5da6abc7fe5f1b11bb1a37a71d6056bbd615181dd99a25ef6af" exitCode=0
Mar 22 00:38:02 crc kubenswrapper[5116]: I0322 00:38:02.837318 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" event={"ID":"7926ef86-b333-4a8b-98ca-7a71240e7702","Type":"ContainerDied","Data":"f1799b324b05b5da6abc7fe5f1b11bb1a37a71d6056bbd615181dd99a25ef6af"}
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.168360 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.257510 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") pod \"7926ef86-b333-4a8b-98ca-7a71240e7702\" (UID: \"7926ef86-b333-4a8b-98ca-7a71240e7702\") "
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.272138 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm" (OuterVolumeSpecName: "kube-api-access-2qsrm") pod "7926ef86-b333-4a8b-98ca-7a71240e7702" (UID: "7926ef86-b333-4a8b-98ca-7a71240e7702"). InnerVolumeSpecName "kube-api-access-2qsrm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.359087 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qsrm\" (UniqueName: \"kubernetes.io/projected/7926ef86-b333-4a8b-98ca-7a71240e7702-kube-api-access-2qsrm\") on node \"crc\" DevicePath \"\""
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.868549 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29568998-4b5vm"
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.868654 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29568998-4b5vm" event={"ID":"7926ef86-b333-4a8b-98ca-7a71240e7702","Type":"ContainerDied","Data":"a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871"}
Mar 22 00:38:04 crc kubenswrapper[5116]: I0322 00:38:04.868692 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a87247cd2f29ccba482b91b2cddea5ec46b037133c222a2f869327986bd74871"
Mar 22 00:38:05 crc kubenswrapper[5116]: I0322 00:38:05.244045 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"]
Mar 22 00:38:05 crc kubenswrapper[5116]: I0322 00:38:05.254842 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568992-2pdr9"]
Mar 22 00:38:05 crc kubenswrapper[5116]: I0322 00:38:05.706517 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9e5029-c0b6-4edb-a790-d67f5791bab2" path="/var/lib/kubelet/pods/6a9e5029-c0b6-4edb-a790-d67f5791bab2/volumes"
Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.478838 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.479437 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.520156 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.951661 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:10 crc kubenswrapper[5116]: I0322 00:38:10.992560 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"]
Mar 22 00:38:12 crc kubenswrapper[5116]: I0322 00:38:12.933094 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-4fdg5" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server" containerID="cri-o://581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" gracePeriod=2
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.341634 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.373240 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") pod \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\" (UID: \"1cb75cdb-9b93-44dd-9523-50b2c6f734d2\") "
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.394454 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx" (OuterVolumeSpecName: "kube-api-access-t79vx") pod "1cb75cdb-9b93-44dd-9523-50b2c6f734d2" (UID: "1cb75cdb-9b93-44dd-9523-50b2c6f734d2"). InnerVolumeSpecName "kube-api-access-t79vx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.474688 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t79vx\" (UniqueName: \"kubernetes.io/projected/1cb75cdb-9b93-44dd-9523-50b2c6f734d2-kube-api-access-t79vx\") on node \"crc\" DevicePath \"\""
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941742 5116 generic.go:358] "Generic (PLEG): container finished" podID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213" exitCode=0
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941865 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerDied","Data":"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"}
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941883 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-4fdg5"
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941914 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-4fdg5" event={"ID":"1cb75cdb-9b93-44dd-9523-50b2c6f734d2","Type":"ContainerDied","Data":"9d6c337411f7f14fd1031f34c36365fcdfe2d4ba51c5292c0732e26942495cb1"}
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.941936 5116 scope.go:117] "RemoveContainer" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.963772 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"]
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.969629 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-4fdg5"]
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.970294 5116 scope.go:117] "RemoveContainer" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"
Mar 22 00:38:13 crc kubenswrapper[5116]: E0322 00:38:13.972771 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213\": container with ID starting with 581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213 not found: ID does not exist" containerID="581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"
Mar 22 00:38:13 crc kubenswrapper[5116]: I0322 00:38:13.972815 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213"} err="failed to get container status \"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213\": rpc error: code = NotFound desc = could not find container \"581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213\": container with ID starting with 581a79763b7f807a72364516169da279c0228c493b8b44c0b4d36b8aa3f5c213 not found: ID does not exist"
Mar 22 00:38:14 crc kubenswrapper[5116]: I0322 00:38:14.697363 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:38:14 crc kubenswrapper[5116]: E0322 00:38:14.697778 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:38:15 crc kubenswrapper[5116]: I0322 00:38:15.709843 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" path="/var/lib/kubelet/pods/1cb75cdb-9b93-44dd-9523-50b2c6f734d2/volumes"
Mar 22 00:38:19 crc kubenswrapper[5116]: I0322 00:38:19.514700 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-75ffdb6fcd-m65lb_0910a8a1-0226-42c8-ab1d-b142d2b7a00d/control-plane-machine-set-operator/0.log"
Mar 22 00:38:19 crc kubenswrapper[5116]: I0322 00:38:19.639559 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-w2nq2_9884d9ba-fbeb-40db-8105-de302262478b/kube-rbac-proxy/0.log"
Mar 22 00:38:19 crc kubenswrapper[5116]: I0322 00:38:19.684571 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-755bb95488-w2nq2_9884d9ba-fbeb-40db-8105-de302262478b/machine-api-operator/0.log"
Mar 22 00:38:28 crc kubenswrapper[5116]: I0322 00:38:28.697558 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:38:28 crc kubenswrapper[5116]: E0322 00:38:28.698679 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:38:32 crc kubenswrapper[5116]: I0322 00:38:32.562389 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-759f64656b-nt75l_e6d082fa-fedb-4089-87be-6bd1f0922f14/cert-manager-controller/0.log"
Mar 22 00:38:32 crc kubenswrapper[5116]: I0322 00:38:32.671359 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-8966b78d4-mnbt9_cee1607e-0fc4-4dfb-bcbb-a2b8d9636b69/cert-manager-cainjector/0.log"
Mar 22 00:38:32 crc kubenswrapper[5116]: I0322 00:38:32.754517 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-597b96b99b-8tskc_57a02b12-f3d8-4264-af22-4fb7bc40602f/cert-manager-webhook/0.log"
Mar 22 00:38:42 crc kubenswrapper[5116]: I0322 00:38:42.698033 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:38:42 crc kubenswrapper[5116]: E0322 00:38:42.698675 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.051004 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-55568fc96c-krbrc_1a434146-4e47-4733-9f73-955a4c92f2d2/prometheus-operator/0.log"
Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.163640 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd_399d0e55-3ad7-48ad-ab17-d0ab1fb9879f/prometheus-operator-admission-webhook/0.log"
Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.265880 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf_2d0f143c-b305-43e1-937e-020d84101219/prometheus-operator-admission-webhook/0.log"
Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.348487 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-587f9c8867-sxrpm_b998a8ef-dbc2-4004-a589-608b0bf774e7/operator/0.log"
Mar 22 00:38:48 crc kubenswrapper[5116]: I0322 00:38:48.478367 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bff5dbc55-tpg7b_9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8/perses-operator/0.log"
Mar 22 00:38:54 crc kubenswrapper[5116]: I0322 00:38:54.698244 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:38:54 crc kubenswrapper[5116]: E0322 00:38:54.698897 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3"
Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.231897 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.232004 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9sq6c_5188f25b-37c3-46f1-b939-199c6e082848/kube-multus/0.log"
Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.239904 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:38:56 crc kubenswrapper[5116]: I0322 00:38:56.239981 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_9f0bc7fcb0822a2c13eb2d22cd8c0641/kube-controller-manager/0.log"
Mar 22 00:38:58 crc kubenswrapper[5116]: I0322 00:38:58.307985 5116 scope.go:117] "RemoveContainer" containerID="b895d59dbbcf506c3c0f496201be78280c2900728c487398d81316e12e931fac"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.433425 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/util/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.586036 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/util/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.617094 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/pull/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.617122 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/pull/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.787098 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/pull/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.798523 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/util/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.803509 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqpwbn_c5f79273-4f52-4d9f-ab31-5af0123ff34c/extract/0.log"
Mar 22 00:39:03 crc kubenswrapper[5116]: I0322 00:39:03.933311 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/util/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.104511 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/util/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.142457 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/pull/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.145323 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/pull/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.345734 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/pull/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.370208 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/extract/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.374279 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etjhdr_01e2c74b-adcf-45a0-ab9a-e7375676f470/util/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.485874 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/util/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.667472 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/util/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.697864 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/pull/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.770724 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/pull/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.870880 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/util/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.896521 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/pull/0.log"
Mar 22 00:39:04 crc kubenswrapper[5116]: I0322 00:39:04.957506 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e52g7w5_7e8a1901-425e-4555-a4f0-fd2ae65d7fb8/extract/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.069660 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/util/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.203601 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/pull/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.208733 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/util/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.217664 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/pull/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.367818 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/extract/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.497391 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/util/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.512151 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726lrb48_dd47ba40-7b8e-4f2c-8e16-62a5f085def8/pull/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.638747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-utilities/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.792268 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-utilities/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.797328 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-content/0.log"
Mar 22 00:39:05 crc kubenswrapper[5116]: I0322 00:39:05.806598 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.050241 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.108602 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.271865 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.275583 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qtstp_fb40c619-b024-485e-8fab-590cf66159b3/registry-server/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.390051 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.413577 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.449219 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.554796 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.568101 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.655296 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-547dbd544d-c8r5t_ba8a6a03-e32a-4121-86e1-d856ddf7a73b/marketplace-operator/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.790119 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.863845 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wkk2b_9cc37111-4983-4dbc-a277-b77d2fc47508/registry-server/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.973671 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-utilities/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.993747 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-content/0.log"
Mar 22 00:39:06 crc kubenswrapper[5116]: I0322 00:39:06.998190 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-content/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.152797 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-content/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.162515 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/extract-utilities/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.375603 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v85n6_132f688e-74fb-4bbb-844a-a23467633e19/registry-server/0.log"
Mar 22 00:39:07 crc kubenswrapper[5116]: I0322 00:39:07.697440 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf"
Mar 22 00:39:07 crc kubenswrapper[5116]: E0322 00:39:07.698773
5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.706345 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:39:21 crc kubenswrapper[5116]: E0322 00:39:21.707216 5116 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-66g6d_openshift-machine-config-operator(9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3)\"" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" podUID="9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3" Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.763275 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-55568fc96c-krbrc_1a434146-4e47-4733-9f73-955a4c92f2d2/prometheus-operator/0.log" Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.780649 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-j2cnf_2d0f143c-b305-43e1-937e-020d84101219/prometheus-operator-admission-webhook/0.log" Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.787451 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-54bdcb8d6-cvtsd_399d0e55-3ad7-48ad-ab17-d0ab1fb9879f/prometheus-operator-admission-webhook/0.log" Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.882719 5116 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-587f9c8867-sxrpm_b998a8ef-dbc2-4004-a589-608b0bf774e7/operator/0.log" Mar 22 00:39:21 crc kubenswrapper[5116]: I0322 00:39:21.914941 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bff5dbc55-tpg7b_9a6aaccb-a3e8-47e4-b74c-e4ec95a2f2e8/perses-operator/0.log" Mar 22 00:39:35 crc kubenswrapper[5116]: I0322 00:39:35.698333 5116 scope.go:117] "RemoveContainer" containerID="d0a7ecdaf494841a3b4314f37d116c49e106f51709624f7bc244ab8589c295bf" Mar 22 00:39:36 crc kubenswrapper[5116]: I0322 00:39:36.666256 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-66g6d" event={"ID":"9cf9c98b-4cbe-4918-b83d-ce7eaa3dabc3","Type":"ContainerStarted","Data":"495f94902224bb639caa02902e55778bab7314c3bab4e573936b06f73f196f83"} Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.145499 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29569000-4mj96"] Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146898 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerName="oc" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146915 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerName="oc" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146946 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.146954 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.147095 5116 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="1cb75cdb-9b93-44dd-9523-50b2c6f734d2" containerName="registry-server" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.147115 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="7926ef86-b333-4a8b-98ca-7a71240e7702" containerName="oc" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.152299 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.155217 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"openshift-service-ca.crt\"" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.156422 5116 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-infra\"/\"csr-approver-sa-dockercfg-zsw2q\"" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.161598 5116 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-infra\"/\"kube-root-ca.crt\"" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.168317 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29569000-4mj96"] Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.286026 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"auto-csr-approver-29569000-4mj96\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.387434 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"auto-csr-approver-29569000-4mj96\" 
(UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.413601 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"auto-csr-approver-29569000-4mj96\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.492931 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.727299 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29569000-4mj96"] Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.729872 5116 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 22 00:40:00 crc kubenswrapper[5116]: I0322 00:40:00.914909 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569000-4mj96" event={"ID":"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2","Type":"ContainerStarted","Data":"a166a3f12307eda4d9559f27f7cae2f024cea0472280b827ba50dca2b4dbe2b5"} Mar 22 00:40:02 crc kubenswrapper[5116]: I0322 00:40:02.935663 5116 generic.go:358] "Generic (PLEG): container finished" podID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerID="7c29c71b7a58a810ccca621c67305721af7b2030d4cd67fc359b6cb2c54817ad" exitCode=0 Mar 22 00:40:02 crc kubenswrapper[5116]: I0322 00:40:02.936295 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569000-4mj96" event={"ID":"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2","Type":"ContainerDied","Data":"7c29c71b7a58a810ccca621c67305721af7b2030d4cd67fc359b6cb2c54817ad"} Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 
00:40:04.312901 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.455839 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") pod \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\" (UID: \"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2\") " Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.464844 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k" (OuterVolumeSpecName: "kube-api-access-tg95k") pod "2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" (UID: "2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2"). InnerVolumeSpecName "kube-api-access-tg95k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.557206 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tg95k\" (UniqueName: \"kubernetes.io/projected/2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2-kube-api-access-tg95k\") on node \"crc\" DevicePath \"\"" Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.969841 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29569000-4mj96" event={"ID":"2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2","Type":"ContainerDied","Data":"a166a3f12307eda4d9559f27f7cae2f024cea0472280b827ba50dca2b4dbe2b5"} Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.970304 5116 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a166a3f12307eda4d9559f27f7cae2f024cea0472280b827ba50dca2b4dbe2b5" Mar 22 00:40:04 crc kubenswrapper[5116]: I0322 00:40:04.969979 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29569000-4mj96" Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.413008 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.423229 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29568994-tx27l"] Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.707312 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f4a688-3ca1-4538-9b16-323899848ec1" path="/var/lib/kubelet/pods/a7f4a688-3ca1-4538-9b16-323899848ec1/volumes" Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.983241 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerID="d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864" exitCode=0 Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.983322 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" event={"ID":"4fa843c6-f428-4c50-8fb6-58db4910b8d8","Type":"ContainerDied","Data":"d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864"} Mar 22 00:40:05 crc kubenswrapper[5116]: I0322 00:40:05.984049 5116 scope.go:117] "RemoveContainer" containerID="d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864" Mar 22 00:40:06 crc kubenswrapper[5116]: I0322 00:40:06.416422 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/gather/0.log" Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.767026 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.769898 5116 kuberuntime_container.go:858] "Killing container with a grace period" 
pod="openshift-must-gather-mmn2b/must-gather-pxgv6" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy" containerID="cri-o://c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e" gracePeriod=2 Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.773448 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mmn2b/must-gather-pxgv6"] Mar 22 00:40:12 crc kubenswrapper[5116]: I0322 00:40:12.774759 5116 status_manager.go:895] "Failed to get status for pod" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" err="pods \"must-gather-pxgv6\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-mmn2b\": no relationship found between node 'crc' and this object" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.067435 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/copy/0.log" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.068694 5116 generic.go:358] "Generic (PLEG): container finished" podID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerID="c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e" exitCode=143 Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.186207 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/copy/0.log" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.186588 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.187973 5116 status_manager.go:895] "Failed to get status for pod" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" err="pods \"must-gather-pxgv6\" is forbidden: User \"system:node:crc\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-mmn2b\": no relationship found between node 'crc' and this object" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.320412 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") pod \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.320539 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") pod \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\" (UID: \"4fa843c6-f428-4c50-8fb6-58db4910b8d8\") " Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.326395 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw" (OuterVolumeSpecName: "kube-api-access-czwdw") pod "4fa843c6-f428-4c50-8fb6-58db4910b8d8" (UID: "4fa843c6-f428-4c50-8fb6-58db4910b8d8"). InnerVolumeSpecName "kube-api-access-czwdw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.373196 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4fa843c6-f428-4c50-8fb6-58db4910b8d8" (UID: "4fa843c6-f428-4c50-8fb6-58db4910b8d8"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.421963 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-czwdw\" (UniqueName: \"kubernetes.io/projected/4fa843c6-f428-4c50-8fb6-58db4910b8d8-kube-api-access-czwdw\") on node \"crc\" DevicePath \"\"" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.421997 5116 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4fa843c6-f428-4c50-8fb6-58db4910b8d8-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 22 00:40:13 crc kubenswrapper[5116]: I0322 00:40:13.706036 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" path="/var/lib/kubelet/pods/4fa843c6-f428-4c50-8fb6-58db4910b8d8/volumes" Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.077621 5116 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mmn2b_must-gather-pxgv6_4fa843c6-f428-4c50-8fb6-58db4910b8d8/copy/0.log" Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.078726 5116 scope.go:117] "RemoveContainer" containerID="c58c0ca6ac202eff124ef375a1ba501807cfd8934d46c29180611db4363c333e" Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.078857 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mmn2b/must-gather-pxgv6" Mar 22 00:40:14 crc kubenswrapper[5116]: I0322 00:40:14.095767 5116 scope.go:117] "RemoveContainer" containerID="d73ca44ab765dc4012a9020a414c305067b5524903bd0e1b9b94504288d3f864" Mar 22 00:40:58 crc kubenswrapper[5116]: I0322 00:40:58.493594 5116 scope.go:117] "RemoveContainer" containerID="1109d2350dd4889e7482fa6160c73f85e35ef58464b38a92e5a8ce5b9d9fcea4" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.128577 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129689 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129701 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129727 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerName="oc" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129734 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerName="oc" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129751 5116 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="gather" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129756 5116 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="gather" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129842 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="gather" Mar 22 00:41:03 crc 
kubenswrapper[5116]: I0322 00:41:03.129852 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="4fa843c6-f428-4c50-8fb6-58db4910b8d8" containerName="copy" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.129869 5116 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ed965b4-6a1e-4dd1-a9ab-c45f6c28b5a2" containerName="oc" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.139959 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.150535 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.265606 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.265691 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.265831 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc 
kubenswrapper[5116]: I0322 00:41:03.306855 5116 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.314387 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.318990 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.367623 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.367693 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.367754 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.368245 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"certified-operators-h7bwj\" (UID: 
\"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.368409 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.389513 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"certified-operators-h7bwj\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.469649 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.469780 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.469850 5116 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod 
\"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.493589 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.571569 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.571846 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.571953 5116 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.572477 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.572549 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.593906 5116 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"redhat-operators-hmpvb\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.636735 5116 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.868313 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:03 crc kubenswrapper[5116]: I0322 00:41:03.991322 5116 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:03 crc kubenswrapper[5116]: W0322 00:41:03.994889 5116 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80416337_e9da_4316_8e19_60269a38f953.slice/crio-248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521 WatchSource:0}: Error finding container 248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521: Status 404 returned error can't find the container with id 248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521 Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.549726 5116 generic.go:358] "Generic (PLEG): container finished" podID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" exitCode=0 Mar 22 00:41:04 crc 
kubenswrapper[5116]: I0322 00:41:04.549805 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2"}
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.550202 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerStarted","Data":"c706029a199b983363e08ea2f49456a42ed0bfa1c14b04db0528466c7004cb3d"}
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.554020 5116 generic.go:358] "Generic (PLEG): container finished" podID="80416337-e9da-4316-8e19-60269a38f953" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" exitCode=0
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.554108 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967"}
Mar 22 00:41:04 crc kubenswrapper[5116]: I0322 00:41:04.554139 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerStarted","Data":"248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521"}
Mar 22 00:41:05 crc kubenswrapper[5116]: I0322 00:41:05.568752 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerStarted","Data":"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693"}
Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.580664 5116 generic.go:358] "Generic (PLEG): container finished"
podID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" exitCode=0 Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.580809 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172"} Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.587245 5116 generic.go:358] "Generic (PLEG): container finished" podID="80416337-e9da-4316-8e19-60269a38f953" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" exitCode=0 Mar 22 00:41:06 crc kubenswrapper[5116]: I0322 00:41:06.587615 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693"} Mar 22 00:41:07 crc kubenswrapper[5116]: I0322 00:41:07.599754 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerStarted","Data":"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec"} Mar 22 00:41:07 crc kubenswrapper[5116]: I0322 00:41:07.605307 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerStarted","Data":"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f"} Mar 22 00:41:07 crc kubenswrapper[5116]: I0322 00:41:07.639676 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmpvb" podStartSLOduration=3.800037667 podStartE2EDuration="4.639651474s" podCreationTimestamp="2026-03-22 00:41:03 +0000 UTC" 
firstStartedPulling="2026-03-22 00:41:04.550736913 +0000 UTC m=+1935.573038286" lastFinishedPulling="2026-03-22 00:41:05.39035071 +0000 UTC m=+1936.412652093" observedRunningTime="2026-03-22 00:41:07.621802607 +0000 UTC m=+1938.644104020" watchObservedRunningTime="2026-03-22 00:41:07.639651474 +0000 UTC m=+1938.661952857" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.493895 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.494325 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.559456 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.597809 5116 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h7bwj" podStartSLOduration=9.867506081 podStartE2EDuration="10.597781571s" podCreationTimestamp="2026-03-22 00:41:03 +0000 UTC" firstStartedPulling="2026-03-22 00:41:04.555750666 +0000 UTC m=+1935.578052079" lastFinishedPulling="2026-03-22 00:41:05.286026196 +0000 UTC m=+1936.308327569" observedRunningTime="2026-03-22 00:41:07.644883706 +0000 UTC m=+1938.667185089" watchObservedRunningTime="2026-03-22 00:41:13.597781571 +0000 UTC m=+1944.620082954" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.637803 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.637887 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.712199 5116 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:13 crc kubenswrapper[5116]: I0322 00:41:13.812092 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:14 crc kubenswrapper[5116]: I0322 00:41:14.695613 5116 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hmpvb" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server" probeResult="failure" output=< Mar 22 00:41:14 crc kubenswrapper[5116]: timeout: failed to connect service ":50051" within 1s Mar 22 00:41:14 crc kubenswrapper[5116]: > Mar 22 00:41:15 crc kubenswrapper[5116]: I0322 00:41:15.680487 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h7bwj" podUID="80416337-e9da-4316-8e19-60269a38f953" containerName="registry-server" containerID="cri-o://cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" gracePeriod=2 Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.633544 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.689924 5116 generic.go:358] "Generic (PLEG): container finished" podID="80416337-e9da-4316-8e19-60269a38f953" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" exitCode=0 Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690038 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h7bwj" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690051 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f"} Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690124 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7bwj" event={"ID":"80416337-e9da-4316-8e19-60269a38f953","Type":"ContainerDied","Data":"248003aa770056ad2c91a568b2f925b77180d76c45127cfcc7f5f08a1ea41521"} Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690156 5116 scope.go:117] "RemoveContainer" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690513 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") pod \"80416337-e9da-4316-8e19-60269a38f953\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690692 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") pod \"80416337-e9da-4316-8e19-60269a38f953\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.690719 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") pod \"80416337-e9da-4316-8e19-60269a38f953\" (UID: \"80416337-e9da-4316-8e19-60269a38f953\") " Mar 22 00:41:16 crc 
kubenswrapper[5116]: I0322 00:41:16.691711 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities" (OuterVolumeSpecName: "utilities") pod "80416337-e9da-4316-8e19-60269a38f953" (UID: "80416337-e9da-4316-8e19-60269a38f953"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.696290 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g" (OuterVolumeSpecName: "kube-api-access-jrg7g") pod "80416337-e9da-4316-8e19-60269a38f953" (UID: "80416337-e9da-4316-8e19-60269a38f953"). InnerVolumeSpecName "kube-api-access-jrg7g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.730882 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80416337-e9da-4316-8e19-60269a38f953" (UID: "80416337-e9da-4316-8e19-60269a38f953"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.748535 5116 scope.go:117] "RemoveContainer" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.775614 5116 scope.go:117] "RemoveContainer" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.792472 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.792510 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80416337-e9da-4316-8e19-60269a38f953-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.792522 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrg7g\" (UniqueName: \"kubernetes.io/projected/80416337-e9da-4316-8e19-60269a38f953-kube-api-access-jrg7g\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.797019 5116 scope.go:117] "RemoveContainer" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" Mar 22 00:41:16 crc kubenswrapper[5116]: E0322 00:41:16.797695 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f\": container with ID starting with cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f not found: ID does not exist" containerID="cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.797748 5116 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f"} err="failed to get container status \"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f\": rpc error: code = NotFound desc = could not find container \"cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f\": container with ID starting with cc9c92696096c656dafa1bb5262bfb4753319fa68fab3c09fff4d408f1c0c86f not found: ID does not exist" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.797768 5116 scope.go:117] "RemoveContainer" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" Mar 22 00:41:16 crc kubenswrapper[5116]: E0322 00:41:16.798098 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693\": container with ID starting with a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693 not found: ID does not exist" containerID="a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.798183 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693"} err="failed to get container status \"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693\": rpc error: code = NotFound desc = could not find container \"a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693\": container with ID starting with a45b77b25e96df84084740fb37f0eef39490e2085a93a6843097dcc84407e693 not found: ID does not exist" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.798215 5116 scope.go:117] "RemoveContainer" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" Mar 22 00:41:16 crc kubenswrapper[5116]: E0322 00:41:16.798516 5116 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967\": container with ID starting with 74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967 not found: ID does not exist" containerID="74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967" Mar 22 00:41:16 crc kubenswrapper[5116]: I0322 00:41:16.798580 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967"} err="failed to get container status \"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967\": rpc error: code = NotFound desc = could not find container \"74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967\": container with ID starting with 74761d0d90a625a5e234b55e19cf91a539ebd7cec6dbfcc7e7db5eb028760967 not found: ID does not exist" Mar 22 00:41:17 crc kubenswrapper[5116]: I0322 00:41:17.031699 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:17 crc kubenswrapper[5116]: I0322 00:41:17.041152 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h7bwj"] Mar 22 00:41:17 crc kubenswrapper[5116]: I0322 00:41:17.711439 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80416337-e9da-4316-8e19-60269a38f953" path="/var/lib/kubelet/pods/80416337-e9da-4316-8e19-60269a38f953/volumes" Mar 22 00:41:23 crc kubenswrapper[5116]: I0322 00:41:23.714837 5116 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:23 crc kubenswrapper[5116]: I0322 00:41:23.787598 5116 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:23 crc kubenswrapper[5116]: I0322 00:41:23.958803 5116 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:24 crc kubenswrapper[5116]: I0322 00:41:24.785838 5116 kuberuntime_container.go:858] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hmpvb" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerName="registry-server" containerID="cri-o://a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" gracePeriod=2 Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.248191 5116 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.338764 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") pod \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.338982 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") pod \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.339045 5116 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") pod \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\" (UID: \"bb7db85b-fa66-42db-bf71-1fe9b38656d0\") " Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.341087 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities" (OuterVolumeSpecName: "utilities") pod 
"bb7db85b-fa66-42db-bf71-1fe9b38656d0" (UID: "bb7db85b-fa66-42db-bf71-1fe9b38656d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.348542 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l" (OuterVolumeSpecName: "kube-api-access-bqw7l") pod "bb7db85b-fa66-42db-bf71-1fe9b38656d0" (UID: "bb7db85b-fa66-42db-bf71-1fe9b38656d0"). InnerVolumeSpecName "kube-api-access-bqw7l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.440483 5116 reconciler_common.go:299] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.440521 5116 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqw7l\" (UniqueName: \"kubernetes.io/projected/bb7db85b-fa66-42db-bf71-1fe9b38656d0-kube-api-access-bqw7l\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.489905 5116 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb7db85b-fa66-42db-bf71-1fe9b38656d0" (UID: "bb7db85b-fa66-42db-bf71-1fe9b38656d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.542442 5116 reconciler_common.go:299] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb7db85b-fa66-42db-bf71-1fe9b38656d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800020 5116 generic.go:358] "Generic (PLEG): container finished" podID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" exitCode=0 Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800406 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec"} Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800447 5116 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmpvb" event={"ID":"bb7db85b-fa66-42db-bf71-1fe9b38656d0","Type":"ContainerDied","Data":"c706029a199b983363e08ea2f49456a42ed0bfa1c14b04db0528466c7004cb3d"} Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800479 5116 scope.go:117] "RemoveContainer" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.800759 5116 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmpvb" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.830498 5116 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.832335 5116 scope.go:117] "RemoveContainer" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.841599 5116 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hmpvb"] Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.854893 5116 scope.go:117] "RemoveContainer" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.883723 5116 scope.go:117] "RemoveContainer" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" Mar 22 00:41:25 crc kubenswrapper[5116]: E0322 00:41:25.884293 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec\": container with ID starting with a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec not found: ID does not exist" containerID="a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884364 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec"} err="failed to get container status \"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec\": rpc error: code = NotFound desc = could not find container \"a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec\": container with ID starting with a52f0f85ecfca25587ee4e799f45a417064887df99c53023b059ef8de18c69ec not found: ID does 
not exist" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884404 5116 scope.go:117] "RemoveContainer" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" Mar 22 00:41:25 crc kubenswrapper[5116]: E0322 00:41:25.884916 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172\": container with ID starting with 1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172 not found: ID does not exist" containerID="1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884958 5116 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172"} err="failed to get container status \"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172\": rpc error: code = NotFound desc = could not find container \"1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172\": container with ID starting with 1437f50b92cfa38489e0ac8d4e9751b959af8284a6f09cd644fac44f01cbc172 not found: ID does not exist" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.884983 5116 scope.go:117] "RemoveContainer" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" Mar 22 00:41:25 crc kubenswrapper[5116]: E0322 00:41:25.885561 5116 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2\": container with ID starting with 9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2 not found: ID does not exist" containerID="9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2" Mar 22 00:41:25 crc kubenswrapper[5116]: I0322 00:41:25.885617 5116 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2"} err="failed to get container status \"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2\": rpc error: code = NotFound desc = could not find container \"9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2\": container with ID starting with 9d697664ac6e1358387eb796015fa0631e395dd7a2b335ec3240fa86e48761a2 not found: ID does not exist" Mar 22 00:41:27 crc kubenswrapper[5116]: I0322 00:41:27.713008 5116 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7db85b-fa66-42db-bf71-1fe9b38656d0" path="/var/lib/kubelet/pods/bb7db85b-fa66-42db-bf71-1fe9b38656d0/volumes"